Apr 21 14:53:12.613554 ip-10-0-129-133 systemd[1]: kubelet.service: Failed to load environment files: No such file or directory
Apr 21 14:53:12.613568 ip-10-0-129-133 systemd[1]: kubelet.service: Failed to run 'start-pre' task: No such file or directory
Apr 21 14:53:12.613578 ip-10-0-129-133 systemd[1]: kubelet.service: Failed with result 'resources'.
Apr 21 14:53:12.613911 ip-10-0-129-133 systemd[1]: Failed to start Kubernetes Kubelet.
Apr 21 14:53:22.663453 ip-10-0-129-133 systemd[1]: kubelet.service: Failed to schedule restart job: Unit crio.service not found.
Apr 21 14:53:22.663469 ip-10-0-129-133 systemd[1]: kubelet.service: Failed with result 'resources'.
-- Boot a56921f3ac8347e3a0d113bebb00c865 --
Apr 21 14:55:35.888869 ip-10-0-129-133 systemd[1]: Starting Kubernetes Kubelet...
Apr 21 14:55:36.356283 ip-10-0-129-133 kubenswrapper[2610]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 21 14:55:36.356283 ip-10-0-129-133 kubenswrapper[2610]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version.
Apr 21 14:55:36.356283 ip-10-0-129-133 kubenswrapper[2610]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 21 14:55:36.356283 ip-10-0-129-133 kubenswrapper[2610]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI.
Apr 21 14:55:36.356283 ip-10-0-129-133 kubenswrapper[2610]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 21 14:55:36.358013 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:55:36.357924 2610 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Apr 21 14:55:36.362015 ip-10-0-129-133 kubenswrapper[2610]: W0421 14:55:36.361999 2610 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 21 14:55:36.362015 ip-10-0-129-133 kubenswrapper[2610]: W0421 14:55:36.362015 2610 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 21 14:55:36.362077 ip-10-0-129-133 kubenswrapper[2610]: W0421 14:55:36.362019 2610 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 21 14:55:36.362077 ip-10-0-129-133 kubenswrapper[2610]: W0421 14:55:36.362022 2610 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 21 14:55:36.362077 ip-10-0-129-133 kubenswrapper[2610]: W0421 14:55:36.362025 2610 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 21 14:55:36.362077 ip-10-0-129-133 kubenswrapper[2610]: W0421 14:55:36.362028 2610 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 21 14:55:36.362077 ip-10-0-129-133 kubenswrapper[2610]: W0421 14:55:36.362032 2610 feature_gate.go:328] unrecognized feature gate: Example2
Apr 21 14:55:36.362077 ip-10-0-129-133 kubenswrapper[2610]: W0421 14:55:36.362034 2610 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 21 14:55:36.362077 ip-10-0-129-133 kubenswrapper[2610]: W0421 14:55:36.362037 2610 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 21 14:55:36.362077 ip-10-0-129-133 kubenswrapper[2610]: W0421 14:55:36.362040 2610 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 21 14:55:36.362077 ip-10-0-129-133 kubenswrapper[2610]: W0421 14:55:36.362043 2610 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 21 14:55:36.362077 ip-10-0-129-133 kubenswrapper[2610]: W0421 14:55:36.362045 2610 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 21 14:55:36.362077 ip-10-0-129-133 kubenswrapper[2610]: W0421 14:55:36.362048 2610 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 21 14:55:36.362077 ip-10-0-129-133 kubenswrapper[2610]: W0421 14:55:36.362050 2610 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 21 14:55:36.362077 ip-10-0-129-133 kubenswrapper[2610]: W0421 14:55:36.362052 2610 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 21 14:55:36.362077 ip-10-0-129-133 kubenswrapper[2610]: W0421 14:55:36.362055 2610 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 21 14:55:36.362077 ip-10-0-129-133 kubenswrapper[2610]: W0421 14:55:36.362057 2610 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 21 14:55:36.362077 ip-10-0-129-133 kubenswrapper[2610]: W0421 14:55:36.362060 2610 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 21 14:55:36.362077 ip-10-0-129-133 kubenswrapper[2610]: W0421 14:55:36.362063 2610 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 21 14:55:36.362077 ip-10-0-129-133 kubenswrapper[2610]: W0421 14:55:36.362066 2610 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 21 14:55:36.362077 ip-10-0-129-133 kubenswrapper[2610]: W0421 14:55:36.362068 2610 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 21 14:55:36.362077 ip-10-0-129-133 kubenswrapper[2610]: W0421 14:55:36.362076 2610 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 21 14:55:36.362542 ip-10-0-129-133 kubenswrapper[2610]: W0421 14:55:36.362079 2610 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 21 14:55:36.362542 ip-10-0-129-133 kubenswrapper[2610]: W0421 14:55:36.362082 2610 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 21 14:55:36.362542 ip-10-0-129-133 kubenswrapper[2610]: W0421 14:55:36.362085 2610 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 21 14:55:36.362542 ip-10-0-129-133 kubenswrapper[2610]: W0421 14:55:36.362088 2610 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 21 14:55:36.362542 ip-10-0-129-133 kubenswrapper[2610]: W0421 14:55:36.362091 2610 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 21 14:55:36.362542 ip-10-0-129-133 kubenswrapper[2610]: W0421 14:55:36.362094 2610 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 21 14:55:36.362542 ip-10-0-129-133 kubenswrapper[2610]: W0421 14:55:36.362097 2610 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 21 14:55:36.362542 ip-10-0-129-133 kubenswrapper[2610]: W0421 14:55:36.362099 2610 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 21 14:55:36.362542 ip-10-0-129-133 kubenswrapper[2610]: W0421 14:55:36.362102 2610 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 21 14:55:36.362542 ip-10-0-129-133 kubenswrapper[2610]: W0421 14:55:36.362104 2610 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 21 14:55:36.362542 ip-10-0-129-133 kubenswrapper[2610]: W0421 14:55:36.362107 2610 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 21 14:55:36.362542 ip-10-0-129-133 kubenswrapper[2610]: W0421 14:55:36.362109 2610 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 21 14:55:36.362542 ip-10-0-129-133 kubenswrapper[2610]: W0421 14:55:36.362112 2610 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 21 14:55:36.362542 ip-10-0-129-133 kubenswrapper[2610]: W0421 14:55:36.362114 2610 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 21 14:55:36.362542 ip-10-0-129-133 kubenswrapper[2610]: W0421 14:55:36.362116 2610 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 21 14:55:36.362542 ip-10-0-129-133 kubenswrapper[2610]: W0421 14:55:36.362119 2610 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 21 14:55:36.362542 ip-10-0-129-133 kubenswrapper[2610]: W0421 14:55:36.362121 2610 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 21 14:55:36.362542 ip-10-0-129-133 kubenswrapper[2610]: W0421 14:55:36.362124 2610 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 21 14:55:36.362542 ip-10-0-129-133 kubenswrapper[2610]: W0421 14:55:36.362126 2610 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 21 14:55:36.362542 ip-10-0-129-133 kubenswrapper[2610]: W0421 14:55:36.362129 2610 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 21 14:55:36.363081 ip-10-0-129-133 kubenswrapper[2610]: W0421 14:55:36.362131 2610 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 21 14:55:36.363081 ip-10-0-129-133 kubenswrapper[2610]: W0421 14:55:36.362134 2610 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 21 14:55:36.363081 ip-10-0-129-133 kubenswrapper[2610]: W0421 14:55:36.362137 2610 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 21 14:55:36.363081 ip-10-0-129-133 kubenswrapper[2610]: W0421 14:55:36.362139 2610 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 21 14:55:36.363081 ip-10-0-129-133 kubenswrapper[2610]: W0421 14:55:36.362143 2610 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 21 14:55:36.363081 ip-10-0-129-133 kubenswrapper[2610]: W0421 14:55:36.362146 2610 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 21 14:55:36.363081 ip-10-0-129-133 kubenswrapper[2610]: W0421 14:55:36.362148 2610 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 21 14:55:36.363081 ip-10-0-129-133 kubenswrapper[2610]: W0421 14:55:36.362151 2610 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 21 14:55:36.363081 ip-10-0-129-133 kubenswrapper[2610]: W0421 14:55:36.362154 2610 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 21 14:55:36.363081 ip-10-0-129-133 kubenswrapper[2610]: W0421 14:55:36.362156 2610 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 21 14:55:36.363081 ip-10-0-129-133 kubenswrapper[2610]: W0421 14:55:36.362159 2610 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 21 14:55:36.363081 ip-10-0-129-133 kubenswrapper[2610]: W0421 14:55:36.362161 2610 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 21 14:55:36.363081 ip-10-0-129-133 kubenswrapper[2610]: W0421 14:55:36.362164 2610 feature_gate.go:328] unrecognized feature gate: Example
Apr 21 14:55:36.363081 ip-10-0-129-133 kubenswrapper[2610]: W0421 14:55:36.362167 2610 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 21 14:55:36.363081 ip-10-0-129-133 kubenswrapper[2610]: W0421 14:55:36.362169 2610 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 21 14:55:36.363081 ip-10-0-129-133 kubenswrapper[2610]: W0421 14:55:36.362172 2610 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 21 14:55:36.363081 ip-10-0-129-133 kubenswrapper[2610]: W0421 14:55:36.362174 2610 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 21 14:55:36.363081 ip-10-0-129-133 kubenswrapper[2610]: W0421 14:55:36.362179 2610 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 21 14:55:36.363081 ip-10-0-129-133 kubenswrapper[2610]: W0421 14:55:36.362183 2610 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 21 14:55:36.363081 ip-10-0-129-133 kubenswrapper[2610]: W0421 14:55:36.362186 2610 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 21 14:55:36.363561 ip-10-0-129-133 kubenswrapper[2610]: W0421 14:55:36.362189 2610 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 21 14:55:36.363561 ip-10-0-129-133 kubenswrapper[2610]: W0421 14:55:36.362192 2610 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 21 14:55:36.363561 ip-10-0-129-133 kubenswrapper[2610]: W0421 14:55:36.362194 2610 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 21 14:55:36.363561 ip-10-0-129-133 kubenswrapper[2610]: W0421 14:55:36.362198 2610 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 21 14:55:36.363561 ip-10-0-129-133 kubenswrapper[2610]: W0421 14:55:36.362202 2610 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 21 14:55:36.363561 ip-10-0-129-133 kubenswrapper[2610]: W0421 14:55:36.362206 2610 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 21 14:55:36.363561 ip-10-0-129-133 kubenswrapper[2610]: W0421 14:55:36.362208 2610 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 21 14:55:36.363561 ip-10-0-129-133 kubenswrapper[2610]: W0421 14:55:36.362211 2610 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 21 14:55:36.363561 ip-10-0-129-133 kubenswrapper[2610]: W0421 14:55:36.362213 2610 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 21 14:55:36.363561 ip-10-0-129-133 kubenswrapper[2610]: W0421 14:55:36.362216 2610 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 21 14:55:36.363561 ip-10-0-129-133 kubenswrapper[2610]: W0421 14:55:36.362218 2610 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 21 14:55:36.363561 ip-10-0-129-133 kubenswrapper[2610]: W0421 14:55:36.362221 2610 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 21 14:55:36.363561 ip-10-0-129-133 kubenswrapper[2610]: W0421 14:55:36.362223 2610 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 21 14:55:36.363561 ip-10-0-129-133 kubenswrapper[2610]: W0421 14:55:36.362226 2610 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 21 14:55:36.363561 ip-10-0-129-133 kubenswrapper[2610]: W0421 14:55:36.362230 2610 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 21 14:55:36.363561 ip-10-0-129-133 kubenswrapper[2610]: W0421 14:55:36.362232 2610 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 21 14:55:36.363561 ip-10-0-129-133 kubenswrapper[2610]: W0421 14:55:36.362236 2610 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 21 14:55:36.363561 ip-10-0-129-133 kubenswrapper[2610]: W0421 14:55:36.362239 2610 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 21 14:55:36.363561 ip-10-0-129-133 kubenswrapper[2610]: W0421 14:55:36.362241 2610 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 21 14:55:36.364067 ip-10-0-129-133 kubenswrapper[2610]: W0421 14:55:36.362244 2610 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 21 14:55:36.364067 ip-10-0-129-133 kubenswrapper[2610]: W0421 14:55:36.362246 2610 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 21 14:55:36.364067 ip-10-0-129-133 kubenswrapper[2610]: W0421 14:55:36.362249 2610 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 21 14:55:36.364067 ip-10-0-129-133 kubenswrapper[2610]: W0421 14:55:36.362252 2610 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 21 14:55:36.364067 ip-10-0-129-133 kubenswrapper[2610]: W0421 14:55:36.362254 2610 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 21 14:55:36.364067 ip-10-0-129-133 kubenswrapper[2610]: W0421 14:55:36.362663 2610 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 21 14:55:36.364067 ip-10-0-129-133 kubenswrapper[2610]: W0421 14:55:36.362670 2610 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 21 14:55:36.364067 ip-10-0-129-133 kubenswrapper[2610]: W0421 14:55:36.362672 2610 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 21 14:55:36.364067 ip-10-0-129-133 kubenswrapper[2610]: W0421 14:55:36.362675 2610 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 21 14:55:36.364067 ip-10-0-129-133 kubenswrapper[2610]: W0421 14:55:36.362678 2610 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 21 14:55:36.364067 ip-10-0-129-133 kubenswrapper[2610]: W0421 14:55:36.362680 2610 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 21 14:55:36.364067 ip-10-0-129-133 kubenswrapper[2610]: W0421 14:55:36.362684 2610 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 21 14:55:36.364067 ip-10-0-129-133 kubenswrapper[2610]: W0421 14:55:36.362686 2610 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 21 14:55:36.364067 ip-10-0-129-133 kubenswrapper[2610]: W0421 14:55:36.362689 2610 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 21 14:55:36.364067 ip-10-0-129-133 kubenswrapper[2610]: W0421 14:55:36.362691 2610 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 21 14:55:36.364067 ip-10-0-129-133 kubenswrapper[2610]: W0421 14:55:36.362694 2610 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 21 14:55:36.364067 ip-10-0-129-133 kubenswrapper[2610]: W0421 14:55:36.362697 2610 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 21 14:55:36.364067 ip-10-0-129-133 kubenswrapper[2610]: W0421 14:55:36.362699 2610 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 21 14:55:36.364067 ip-10-0-129-133 kubenswrapper[2610]: W0421 14:55:36.362702 2610 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 21 14:55:36.364514 ip-10-0-129-133 kubenswrapper[2610]: W0421 14:55:36.362704 2610 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 21 14:55:36.364514 ip-10-0-129-133 kubenswrapper[2610]: W0421 14:55:36.362707 2610 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 21 14:55:36.364514 ip-10-0-129-133 kubenswrapper[2610]: W0421 14:55:36.362710 2610 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 21 14:55:36.364514 ip-10-0-129-133 kubenswrapper[2610]: W0421 14:55:36.362714 2610 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 21 14:55:36.364514 ip-10-0-129-133 kubenswrapper[2610]: W0421 14:55:36.362718 2610 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 21 14:55:36.364514 ip-10-0-129-133 kubenswrapper[2610]: W0421 14:55:36.362721 2610 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 21 14:55:36.364514 ip-10-0-129-133 kubenswrapper[2610]: W0421 14:55:36.362723 2610 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 21 14:55:36.364514 ip-10-0-129-133 kubenswrapper[2610]: W0421 14:55:36.362727 2610 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 21 14:55:36.364514 ip-10-0-129-133 kubenswrapper[2610]: W0421 14:55:36.362729 2610 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 21 14:55:36.364514 ip-10-0-129-133 kubenswrapper[2610]: W0421 14:55:36.362732 2610 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 21 14:55:36.364514 ip-10-0-129-133 kubenswrapper[2610]: W0421 14:55:36.362734 2610 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 21 14:55:36.364514 ip-10-0-129-133 kubenswrapper[2610]: W0421 14:55:36.362737 2610 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 21 14:55:36.364514 ip-10-0-129-133 kubenswrapper[2610]: W0421 14:55:36.362739 2610 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 21 14:55:36.364514 ip-10-0-129-133 kubenswrapper[2610]: W0421 14:55:36.362741 2610 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 21 14:55:36.364514 ip-10-0-129-133 kubenswrapper[2610]: W0421 14:55:36.362744 2610 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 21 14:55:36.364514 ip-10-0-129-133 kubenswrapper[2610]: W0421 14:55:36.362746 2610 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 21 14:55:36.364514 ip-10-0-129-133 kubenswrapper[2610]: W0421 14:55:36.362749 2610 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 21 14:55:36.364514 ip-10-0-129-133 kubenswrapper[2610]: W0421 14:55:36.362751 2610 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 21 14:55:36.364514 ip-10-0-129-133 kubenswrapper[2610]: W0421 14:55:36.362753 2610 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 21 14:55:36.364514 ip-10-0-129-133 kubenswrapper[2610]: W0421 14:55:36.362756 2610 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 21 14:55:36.365019 ip-10-0-129-133 kubenswrapper[2610]: W0421 14:55:36.362765 2610 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 21 14:55:36.365019 ip-10-0-129-133 kubenswrapper[2610]: W0421 14:55:36.362768 2610 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 21 14:55:36.365019 ip-10-0-129-133 kubenswrapper[2610]: W0421 14:55:36.362770 2610 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 21 14:55:36.365019 ip-10-0-129-133 kubenswrapper[2610]: W0421 14:55:36.362773 2610 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 21 14:55:36.365019 ip-10-0-129-133 kubenswrapper[2610]: W0421 14:55:36.362775 2610 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 21 14:55:36.365019 ip-10-0-129-133 kubenswrapper[2610]: W0421 14:55:36.362778 2610 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 21 14:55:36.365019 ip-10-0-129-133 kubenswrapper[2610]: W0421 14:55:36.362781 2610 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 21 14:55:36.365019 ip-10-0-129-133 kubenswrapper[2610]: W0421 14:55:36.362783 2610 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 21 14:55:36.365019 ip-10-0-129-133 kubenswrapper[2610]: W0421 14:55:36.362785 2610 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 21 14:55:36.365019 ip-10-0-129-133 kubenswrapper[2610]: W0421 14:55:36.362788 2610 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 21 14:55:36.365019 ip-10-0-129-133 kubenswrapper[2610]: W0421 14:55:36.362791 2610 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 21 14:55:36.365019 ip-10-0-129-133 kubenswrapper[2610]: W0421 14:55:36.362793 2610 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 21 14:55:36.365019 ip-10-0-129-133 kubenswrapper[2610]: W0421 14:55:36.362796 2610 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 21 14:55:36.365019 ip-10-0-129-133 kubenswrapper[2610]: W0421 14:55:36.362799 2610 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 21 14:55:36.365019 ip-10-0-129-133 kubenswrapper[2610]: W0421 14:55:36.362803 2610 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 21 14:55:36.365019 ip-10-0-129-133 kubenswrapper[2610]: W0421 14:55:36.362806 2610 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 21 14:55:36.365019 ip-10-0-129-133 kubenswrapper[2610]: W0421 14:55:36.362808 2610 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 21 14:55:36.365019 ip-10-0-129-133 kubenswrapper[2610]: W0421 14:55:36.362811 2610 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 21 14:55:36.365019 ip-10-0-129-133 kubenswrapper[2610]: W0421 14:55:36.362813 2610 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 21 14:55:36.365481 ip-10-0-129-133 kubenswrapper[2610]: W0421 14:55:36.362816 2610 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 21 14:55:36.365481 ip-10-0-129-133 kubenswrapper[2610]: W0421 14:55:36.362819 2610 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 21 14:55:36.365481 ip-10-0-129-133 kubenswrapper[2610]: W0421 14:55:36.362822 2610 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 21 14:55:36.365481 ip-10-0-129-133 kubenswrapper[2610]: W0421 14:55:36.362824 2610 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 21 14:55:36.365481 ip-10-0-129-133 kubenswrapper[2610]: W0421 14:55:36.362826 2610 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 21 14:55:36.365481 ip-10-0-129-133 kubenswrapper[2610]: W0421 14:55:36.362829 2610 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 21 14:55:36.365481 ip-10-0-129-133 kubenswrapper[2610]: W0421 14:55:36.362831 2610 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 21 14:55:36.365481 ip-10-0-129-133 kubenswrapper[2610]: W0421 14:55:36.362834 2610 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 21 14:55:36.365481 ip-10-0-129-133 kubenswrapper[2610]: W0421 14:55:36.362836 2610 feature_gate.go:328] unrecognized feature gate: Example
Apr 21 14:55:36.365481 ip-10-0-129-133 kubenswrapper[2610]: W0421 14:55:36.362839 2610 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 21 14:55:36.365481 ip-10-0-129-133 kubenswrapper[2610]: W0421 14:55:36.362841 2610 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 21 14:55:36.365481 ip-10-0-129-133 kubenswrapper[2610]: W0421 14:55:36.362843 2610 feature_gate.go:328] unrecognized feature gate: Example2
Apr 21 14:55:36.365481 ip-10-0-129-133 kubenswrapper[2610]: W0421 14:55:36.362846 2610 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 21 14:55:36.365481 ip-10-0-129-133 kubenswrapper[2610]: W0421 14:55:36.362849 2610 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 21 14:55:36.365481 ip-10-0-129-133 kubenswrapper[2610]: W0421 14:55:36.362851 2610 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 21 14:55:36.365481 ip-10-0-129-133 kubenswrapper[2610]: W0421 14:55:36.362854 2610 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 21 14:55:36.365481 ip-10-0-129-133 kubenswrapper[2610]: W0421 14:55:36.362857 2610 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 21 14:55:36.365481 ip-10-0-129-133 kubenswrapper[2610]: W0421 14:55:36.362860 2610 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 21 14:55:36.365481 ip-10-0-129-133 kubenswrapper[2610]: W0421 14:55:36.362862 2610 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 21 14:55:36.365481 ip-10-0-129-133 kubenswrapper[2610]: W0421 14:55:36.362864 2610 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 21 14:55:36.365983 ip-10-0-129-133 kubenswrapper[2610]: W0421 14:55:36.362867 2610 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 21 14:55:36.365983 ip-10-0-129-133 kubenswrapper[2610]: W0421 14:55:36.362869 2610 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 21 14:55:36.365983 ip-10-0-129-133 kubenswrapper[2610]: W0421 14:55:36.362872 2610 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 21 14:55:36.365983 ip-10-0-129-133 kubenswrapper[2610]: W0421 14:55:36.362874 2610 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 21 14:55:36.365983 ip-10-0-129-133 kubenswrapper[2610]: W0421 14:55:36.362877 2610 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 21 14:55:36.365983 ip-10-0-129-133 kubenswrapper[2610]: W0421 14:55:36.362879 2610 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 21 14:55:36.365983 ip-10-0-129-133 kubenswrapper[2610]: W0421 14:55:36.362881 2610 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 21 14:55:36.365983 ip-10-0-129-133 kubenswrapper[2610]: W0421 14:55:36.362884 2610 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 21 14:55:36.365983 ip-10-0-129-133 kubenswrapper[2610]: W0421 14:55:36.362887 2610 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 21 14:55:36.365983 ip-10-0-129-133 kubenswrapper[2610]: W0421 14:55:36.362890 2610 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 21 14:55:36.365983 ip-10-0-129-133 kubenswrapper[2610]: W0421 14:55:36.362892 2610 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 21 14:55:36.365983 ip-10-0-129-133 kubenswrapper[2610]: W0421 14:55:36.362895 2610 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 21 14:55:36.365983 ip-10-0-129-133 kubenswrapper[2610]: W0421 14:55:36.362897 2610 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 21 14:55:36.365983 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:55:36.362970 2610 flags.go:64] FLAG: --address="0.0.0.0"
Apr 21 14:55:36.365983 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:55:36.362978 2610 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Apr 21 14:55:36.365983 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:55:36.362985 2610 flags.go:64] FLAG: --anonymous-auth="true"
Apr 21 14:55:36.365983 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:55:36.362990 2610 flags.go:64] FLAG: --application-metrics-count-limit="100"
Apr 21 14:55:36.365983 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:55:36.362994 2610 flags.go:64] FLAG: --authentication-token-webhook="false"
Apr 21 14:55:36.365983 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:55:36.362997 2610 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
Apr 21 14:55:36.365983 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:55:36.363002 2610 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
Apr 21 14:55:36.365983 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:55:36.363007 2610 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
Apr 21 14:55:36.366570 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:55:36.363010 2610 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
Apr 21 14:55:36.366570 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:55:36.363013 2610 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
Apr 21 14:55:36.366570 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:55:36.363017 2610 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
Apr 21 14:55:36.366570 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:55:36.363021 2610 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Apr 21 14:55:36.366570 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:55:36.363024 2610 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Apr 21 14:55:36.366570 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:55:36.363027 2610 flags.go:64] FLAG: --cgroup-root=""
Apr 21 14:55:36.366570 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:55:36.363030 2610 flags.go:64] FLAG: --cgroups-per-qos="true"
Apr 21 14:55:36.366570 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:55:36.363033 2610 flags.go:64] FLAG: --client-ca-file=""
Apr 21 14:55:36.366570 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:55:36.363036 2610 flags.go:64] FLAG: --cloud-config=""
Apr 21 14:55:36.366570 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:55:36.363038 2610 flags.go:64] FLAG: --cloud-provider="external"
Apr 21 14:55:36.366570 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:55:36.363041 2610 flags.go:64] FLAG: --cluster-dns="[]"
Apr 21 14:55:36.366570 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:55:36.363045 2610 flags.go:64] FLAG: --cluster-domain=""
Apr 21 14:55:36.366570 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:55:36.363048 2610 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Apr 21 14:55:36.366570 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:55:36.363051 2610 flags.go:64] FLAG: --config-dir=""
Apr 21 14:55:36.366570 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:55:36.363054 2610 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
Apr 21 14:55:36.366570 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:55:36.363057 2610 flags.go:64] FLAG: --container-log-max-files="5"
Apr 21 14:55:36.366570 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:55:36.363061 2610 flags.go:64] FLAG: --container-log-max-size="10Mi"
Apr 21 14:55:36.366570 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:55:36.363064 2610 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
Apr 21 14:55:36.366570 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:55:36.363071 2610 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock"
Apr 21 14:55:36.366570 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:55:36.363074 2610 flags.go:64] FLAG: --containerd-namespace="k8s.io"
Apr 21 14:55:36.366570 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:55:36.363077 2610 flags.go:64] FLAG: --contention-profiling="false"
Apr 21 14:55:36.366570 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:55:36.363080 2610 flags.go:64] FLAG: --cpu-cfs-quota="true"
Apr 21 14:55:36.366570 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:55:36.363083 2610 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms"
Apr 21 14:55:36.366570 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:55:36.363086 2610 flags.go:64] FLAG: --cpu-manager-policy="none"
Apr 21 14:55:36.366570 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:55:36.363089 2610 flags.go:64] FLAG: --cpu-manager-policy-options=""
Apr 21 14:55:36.367209 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:55:36.363094 2610 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s"
Apr 21 14:55:36.367209 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:55:36.363097 2610 flags.go:64] FLAG: --enable-controller-attach-detach="true"
Apr 21 14:55:36.367209 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:55:36.363100 2610 flags.go:64] FLAG: --enable-debugging-handlers="true"
Apr 21 14:55:36.367209 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:55:36.363103 2610 flags.go:64] FLAG: --enable-load-reader="false"
Apr 21 14:55:36.367209 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:55:36.363106 2610 flags.go:64] FLAG: --enable-server="true"
Apr 21 14:55:36.367209 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:55:36.363109 2610 flags.go:64] FLAG: --enforce-node-allocatable="[pods]"
Apr 21 14:55:36.367209 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:55:36.363115 2610 flags.go:64] FLAG: --event-burst="100"
Apr 21 14:55:36.367209 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:55:36.363118 2610 flags.go:64] FLAG: --event-qps="50"
Apr 21 14:55:36.367209 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:55:36.363120 2610 flags.go:64] FLAG: --event-storage-age-limit="default=0"
Apr 21 14:55:36.367209 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:55:36.363123 2610 flags.go:64] FLAG: --event-storage-event-limit="default=0"
Apr 21 14:55:36.367209 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:55:36.363126 2610 flags.go:64] FLAG: --eviction-hard=""
Apr 21 14:55:36.367209 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:55:36.363130 2610 flags.go:64] FLAG: --eviction-max-pod-grace-period="0"
Apr 21 14:55:36.367209 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:55:36.363133 2610 flags.go:64] FLAG: --eviction-minimum-reclaim=""
Apr 21 14:55:36.367209 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:55:36.363136 2610 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s"
Apr 21 14:55:36.367209 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:55:36.363139 2610 flags.go:64] FLAG: --eviction-soft=""
Apr 21 14:55:36.367209 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:55:36.363142 2610 flags.go:64] FLAG: --eviction-soft-grace-period=""
Apr 21 14:55:36.367209 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:55:36.363144 2610 flags.go:64] FLAG: --exit-on-lock-contention="false"
Apr 21 14:55:36.367209 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:55:36.363147 2610 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false"
Apr 21 14:55:36.367209 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:55:36.363150 2610 flags.go:64] FLAG: --experimental-mounter-path=""
Apr 21 14:55:36.367209 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:55:36.363153 2610 flags.go:64] FLAG: --fail-cgroupv1="false"
Apr 21 14:55:36.367209 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:55:36.363155 2610 flags.go:64] FLAG: --fail-swap-on="true"
Apr 21 14:55:36.367209 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:55:36.363158 2610 flags.go:64] FLAG: --feature-gates=""
Apr 21 14:55:36.367209 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:55:36.363162 2610 flags.go:64] FLAG: --file-check-frequency="20s"
Apr 21 14:55:36.367209 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:55:36.363165 2610 flags.go:64] FLAG: --global-housekeeping-interval="1m0s"
Apr 21 14:55:36.367209 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:55:36.363168 2610 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge"
Apr 21 14:55:36.367829 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:55:36.363172 2610 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1"
Apr 21 14:55:36.367829 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:55:36.363175 2610 flags.go:64] FLAG: --healthz-port="10248"
Apr 21 14:55:36.367829 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:55:36.363178 2610 flags.go:64] FLAG: --help="false"
Apr 21 14:55:36.367829 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:55:36.363181 2610 flags.go:64] FLAG: --hostname-override="ip-10-0-129-133.ec2.internal"
Apr 21 14:55:36.367829 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:55:36.363184 2610 flags.go:64] FLAG: --housekeeping-interval="10s"
Apr 21 14:55:36.367829 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:55:36.363187 2610 flags.go:64] FLAG: --http-check-frequency="20s"
Apr 21 14:55:36.367829 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:55:36.363190 2610 flags.go:64] FLAG: --image-credential-provider-bin-dir="/usr/libexec/kubelet-image-credential-provider-plugins"
Apr 21 14:55:36.367829 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:55:36.363193 2610 flags.go:64] FLAG: --image-credential-provider-config="/etc/kubernetes/credential-providers/ecr-credential-provider.yaml"
Apr 21 14:55:36.367829 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:55:36.363196 2610 flags.go:64] FLAG: --image-gc-high-threshold="85"
Apr 21 14:55:36.367829 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:55:36.363200 2610 flags.go:64] FLAG: --image-gc-low-threshold="80"
Apr 21 14:55:36.367829 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:55:36.363203 2610 flags.go:64] FLAG: --image-service-endpoint=""
Apr 21 14:55:36.367829 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:55:36.363205 2610 flags.go:64] FLAG: --kernel-memcg-notification="false"
Apr 21 14:55:36.367829 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:55:36.363208 2610 flags.go:64] FLAG: --kube-api-burst="100"
Apr 21 14:55:36.367829 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:55:36.363211 2610 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf"
Apr 21 14:55:36.367829 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:55:36.363214 2610 flags.go:64] FLAG: --kube-api-qps="50"
Apr 21 14:55:36.367829 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:55:36.363217 2610 flags.go:64] FLAG: --kube-reserved=""
Apr 21 14:55:36.367829 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:55:36.363220 2610 flags.go:64] FLAG: --kube-reserved-cgroup=""
Apr 21 14:55:36.367829 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:55:36.363223 2610 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig"
Apr 21 14:55:36.367829 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:55:36.363226 2610 flags.go:64] FLAG: --kubelet-cgroups=""
Apr 21 14:55:36.367829 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:55:36.363228 2610 flags.go:64] FLAG: --local-storage-capacity-isolation="true"
Apr 21 14:55:36.367829 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:55:36.363231 2610 flags.go:64] FLAG: --lock-file=""
Apr 21 14:55:36.367829 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:55:36.363234 2610 flags.go:64] FLAG: --log-cadvisor-usage="false"
Apr 21 14:55:36.367829 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:55:36.363236 2610 flags.go:64] FLAG: --log-flush-frequency="5s"
Apr 21 14:55:36.367829 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:55:36.363239 2610 flags.go:64] FLAG: --log-json-info-buffer-size="0"
Apr 21 14:55:36.368409 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:55:36.363245 2610 flags.go:64] FLAG: --log-json-split-stream="false"
Apr 21 14:55:36.368409 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:55:36.363247 2610 flags.go:64] FLAG: --log-text-info-buffer-size="0"
Apr 21 14:55:36.368409 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:55:36.363250 2610 flags.go:64] FLAG: --log-text-split-stream="false"
Apr 21 14:55:36.368409 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:55:36.363253 2610 flags.go:64] FLAG: --logging-format="text"
Apr 21 14:55:36.368409 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:55:36.363256 2610 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id"
Apr 21 14:55:36.368409 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:55:36.363260 2610 flags.go:64] FLAG: --make-iptables-util-chains="true"
Apr 21 14:55:36.368409 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:55:36.363263 2610 flags.go:64] FLAG: --manifest-url=""
Apr 21 14:55:36.368409 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:55:36.363266 2610 flags.go:64] FLAG: --manifest-url-header=""
Apr 21 14:55:36.368409 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:55:36.363272 2610 flags.go:64] FLAG: --max-housekeeping-interval="15s"
Apr 21 14:55:36.368409 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:55:36.363275 2610 flags.go:64] FLAG: --max-open-files="1000000"
Apr 21 14:55:36.368409 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:55:36.363279 2610 flags.go:64] FLAG: --max-pods="110"
Apr 21 14:55:36.368409 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:55:36.363282 2610 flags.go:64] FLAG: --maximum-dead-containers="-1"
Apr 21 14:55:36.368409 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:55:36.363285 2610 flags.go:64] FLAG: --maximum-dead-containers-per-container="1"
Apr 21 14:55:36.368409 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:55:36.363288 2610 flags.go:64] FLAG: --memory-manager-policy="None"
Apr 21 14:55:36.368409 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:55:36.363291 2610 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s"
Apr 21 14:55:36.368409 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:55:36.363294 2610 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s"
Apr 21 14:55:36.368409 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:55:36.363297 2610 flags.go:64] FLAG: --node-ip="0.0.0.0"
Apr 21 14:55:36.368409 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:55:36.363300 2610 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/worker=,node.openshift.io/os_id=rhel"
Apr 21 14:55:36.368409 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:55:36.363307 2610 flags.go:64] FLAG: --node-status-max-images="50"
Apr 21 14:55:36.368409 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:55:36.363310 2610 flags.go:64] FLAG: --node-status-update-frequency="10s"
Apr 21 14:55:36.368409 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:55:36.363313 2610 flags.go:64] FLAG: --oom-score-adj="-999"
Apr 21 14:55:36.368409 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:55:36.363316 2610 flags.go:64] FLAG: --pod-cidr=""
Apr 21 14:55:36.368409 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:55:36.363318 2610 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c8cfe89231412ff3ee8cb6207fa0be33cad0f08e88c9c0f1e9f7e8c6f14d6715"
Apr 21 14:55:36.369010 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:55:36.363324 2610 flags.go:64] FLAG: --pod-manifest-path=""
Apr 21 14:55:36.369010 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:55:36.363327 2610 flags.go:64] FLAG: --pod-max-pids="-1"
Apr 21 14:55:36.369010 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:55:36.363330 2610 flags.go:64] FLAG: --pods-per-core="0"
Apr 21 14:55:36.369010 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:55:36.363333 2610 flags.go:64] FLAG: --port="10250"
Apr 21 14:55:36.369010 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:55:36.363336 2610 flags.go:64] FLAG: --protect-kernel-defaults="false"
Apr 21 14:55:36.369010 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:55:36.363338 2610 flags.go:64] FLAG: --provider-id="aws:///us-east-1a/i-0dee0ab7ddcc58b49"
Apr 21 14:55:36.369010 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:55:36.363341 2610 flags.go:64] FLAG: --qos-reserved=""
Apr 21 14:55:36.369010 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:55:36.363344 2610 flags.go:64] FLAG: --read-only-port="10255"
Apr 21 14:55:36.369010 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:55:36.363347 2610 flags.go:64] FLAG: --register-node="true"
Apr 21 14:55:36.369010 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:55:36.363350 2610 flags.go:64] FLAG: --register-schedulable="true"
Apr 21 14:55:36.369010 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:55:36.363352 2610 flags.go:64] FLAG: --register-with-taints=""
Apr 21 14:55:36.369010 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:55:36.363356 2610 flags.go:64] FLAG: --registry-burst="10"
Apr 21 14:55:36.369010 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:55:36.363358 2610 flags.go:64] FLAG: --registry-qps="5"
Apr 21 14:55:36.369010 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:55:36.363362 2610 flags.go:64] FLAG: --reserved-cpus=""
Apr 21 14:55:36.369010 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:55:36.363364 2610 flags.go:64] FLAG: --reserved-memory=""
Apr 21 14:55:36.369010 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:55:36.363368 2610 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf"
Apr 21 14:55:36.369010 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:55:36.363371 2610 flags.go:64] FLAG: --root-dir="/var/lib/kubelet"
Apr 21 14:55:36.369010 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:55:36.363374 2610 flags.go:64] FLAG: --rotate-certificates="false"
Apr 21 14:55:36.369010 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:55:36.363378 2610 flags.go:64] FLAG: --rotate-server-certificates="false"
Apr 21 14:55:36.369010 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:55:36.363381 2610 flags.go:64] FLAG: --runonce="false"
Apr 21 14:55:36.369010 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:55:36.363384 2610 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service"
Apr 21 14:55:36.369010 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:55:36.363387 2610 flags.go:64] FLAG: --runtime-request-timeout="2m0s"
Apr 21 14:55:36.369010 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:55:36.363390 2610 flags.go:64] FLAG: --seccomp-default="false"
Apr 21 14:55:36.369010 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:55:36.363392 2610 flags.go:64] FLAG: --serialize-image-pulls="true"
Apr 21 14:55:36.369010 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:55:36.363395 2610 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s"
Apr 21 14:55:36.369010 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:55:36.363398 2610 flags.go:64] FLAG: --storage-driver-db="cadvisor"
Apr 21 14:55:36.369668 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:55:36.363401 2610 flags.go:64] FLAG: --storage-driver-host="localhost:8086"
Apr 21 14:55:36.369668 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:55:36.363404 2610 flags.go:64] FLAG: --storage-driver-password="root"
Apr 21 14:55:36.369668 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:55:36.363407 2610 flags.go:64] FLAG: --storage-driver-secure="false"
Apr 21 14:55:36.369668 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:55:36.363410 2610 flags.go:64] FLAG: --storage-driver-table="stats"
Apr 21 14:55:36.369668 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:55:36.363413 2610 flags.go:64] FLAG: --storage-driver-user="root"
Apr 21 14:55:36.369668 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:55:36.363416 2610 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s"
Apr 21 14:55:36.369668 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:55:36.363418 2610 flags.go:64] FLAG: --sync-frequency="1m0s"
Apr 21 14:55:36.369668 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:55:36.363421 2610 flags.go:64] FLAG: --system-cgroups=""
Apr 21 14:55:36.369668 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:55:36.363424 2610 flags.go:64] FLAG: --system-reserved="cpu=500m,ephemeral-storage=1Gi,memory=1Gi"
Apr 21 14:55:36.369668 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:55:36.363429 2610 flags.go:64] FLAG: --system-reserved-cgroup=""
Apr 21 14:55:36.369668 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:55:36.363432 2610 flags.go:64] FLAG: --tls-cert-file=""
Apr 21 14:55:36.369668 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:55:36.363435 2610 flags.go:64] FLAG: --tls-cipher-suites="[]"
Apr 21 14:55:36.369668 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:55:36.363438 2610 flags.go:64] FLAG: --tls-min-version=""
Apr 21 14:55:36.369668 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:55:36.363441 2610 flags.go:64] FLAG: --tls-private-key-file=""
Apr 21 14:55:36.369668 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:55:36.363443 2610 flags.go:64] FLAG: --topology-manager-policy="none"
Apr 21 14:55:36.369668 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:55:36.363446 2610 flags.go:64] FLAG: --topology-manager-policy-options=""
Apr 21 14:55:36.369668 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:55:36.363449 2610 flags.go:64] FLAG: --topology-manager-scope="container"
Apr 21 14:55:36.369668 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:55:36.363452 2610 flags.go:64] FLAG: --v="2"
Apr 21 14:55:36.369668 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:55:36.363456 2610 flags.go:64] FLAG: --version="false"
Apr 21 14:55:36.369668 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:55:36.363461 2610 flags.go:64] FLAG: --vmodule=""
Apr 21 14:55:36.369668 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:55:36.363465 2610 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec"
Apr 21 14:55:36.369668 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:55:36.363468 2610 flags.go:64] FLAG: --volume-stats-agg-period="1m0s"
Apr 21 14:55:36.369668 ip-10-0-129-133 kubenswrapper[2610]: W0421 14:55:36.363562 2610 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 21 14:55:36.369668 ip-10-0-129-133 kubenswrapper[2610]: W0421 14:55:36.363566 2610 feature_gate.go:328] unrecognized feature gate: Example2
Apr 21 14:55:36.369668 ip-10-0-129-133 kubenswrapper[2610]: W0421 14:55:36.363584 2610 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 21 14:55:36.370283 ip-10-0-129-133 kubenswrapper[2610]: W0421 14:55:36.363587 2610 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 21 14:55:36.370283 ip-10-0-129-133 kubenswrapper[2610]: W0421 14:55:36.363590 2610 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 21 14:55:36.370283 ip-10-0-129-133 kubenswrapper[2610]: W0421 14:55:36.363593 2610 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 21 14:55:36.370283 ip-10-0-129-133 kubenswrapper[2610]: W0421 14:55:36.363596 2610 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 21 14:55:36.370283 ip-10-0-129-133 kubenswrapper[2610]: W0421 14:55:36.363598 2610 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 21 14:55:36.370283 ip-10-0-129-133 kubenswrapper[2610]: W0421 14:55:36.363601 2610 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 21 14:55:36.370283 ip-10-0-129-133 kubenswrapper[2610]: W0421 14:55:36.363604 2610 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 21 14:55:36.370283 ip-10-0-129-133 kubenswrapper[2610]: W0421 14:55:36.363606 2610 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 21 14:55:36.370283 ip-10-0-129-133 kubenswrapper[2610]: W0421 14:55:36.363609 2610 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 21 14:55:36.370283 ip-10-0-129-133 kubenswrapper[2610]: W0421 14:55:36.363611 2610 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 21 14:55:36.370283 ip-10-0-129-133 kubenswrapper[2610]: W0421 14:55:36.363614 2610 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 21 14:55:36.370283 ip-10-0-129-133 kubenswrapper[2610]: W0421 14:55:36.363617 2610 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 21 14:55:36.370283 ip-10-0-129-133 kubenswrapper[2610]: W0421 14:55:36.363620 2610 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 21 14:55:36.370283 ip-10-0-129-133 kubenswrapper[2610]: W0421 14:55:36.363623 2610 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 21 14:55:36.370283 ip-10-0-129-133 kubenswrapper[2610]: W0421 14:55:36.363627 2610 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 21 14:55:36.370283 ip-10-0-129-133 kubenswrapper[2610]: W0421 14:55:36.363629 2610 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 21 14:55:36.370283 ip-10-0-129-133 kubenswrapper[2610]: W0421 14:55:36.363632 2610 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 21 14:55:36.370283 ip-10-0-129-133 kubenswrapper[2610]: W0421 14:55:36.363635 2610 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 21 14:55:36.370283 ip-10-0-129-133 kubenswrapper[2610]: W0421 14:55:36.363638 2610 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 21 14:55:36.370790 ip-10-0-129-133 kubenswrapper[2610]: W0421 14:55:36.363642 2610 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 21 14:55:36.370790 ip-10-0-129-133 kubenswrapper[2610]: W0421 14:55:36.363645 2610 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 21 14:55:36.370790 ip-10-0-129-133 kubenswrapper[2610]: W0421 14:55:36.363648 2610 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 21 14:55:36.370790 ip-10-0-129-133 kubenswrapper[2610]: W0421 14:55:36.363651 2610 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 21 14:55:36.370790 ip-10-0-129-133 kubenswrapper[2610]: W0421 14:55:36.363654 2610 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 21 14:55:36.370790 ip-10-0-129-133 kubenswrapper[2610]: W0421 14:55:36.363657 2610 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 21 14:55:36.370790 ip-10-0-129-133 kubenswrapper[2610]: W0421 14:55:36.363659 2610 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 21 14:55:36.370790 ip-10-0-129-133 kubenswrapper[2610]: W0421 14:55:36.363664 2610 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 21 14:55:36.370790 ip-10-0-129-133 kubenswrapper[2610]: W0421 14:55:36.363667 2610 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 21 14:55:36.370790 ip-10-0-129-133 kubenswrapper[2610]: W0421 14:55:36.363669 2610 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 21 14:55:36.370790 ip-10-0-129-133 kubenswrapper[2610]: W0421 14:55:36.363672 2610 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 21 14:55:36.370790 ip-10-0-129-133 kubenswrapper[2610]: W0421 14:55:36.363675 2610 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 21 14:55:36.370790 ip-10-0-129-133 kubenswrapper[2610]: W0421 14:55:36.363679 2610 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 21 14:55:36.370790 ip-10-0-129-133 kubenswrapper[2610]: W0421 14:55:36.363682 2610 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 21 14:55:36.370790 ip-10-0-129-133 kubenswrapper[2610]: W0421 14:55:36.363685 2610 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 21 14:55:36.370790 ip-10-0-129-133 kubenswrapper[2610]: W0421 14:55:36.363687 2610 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 21 14:55:36.370790 ip-10-0-129-133 kubenswrapper[2610]: W0421 14:55:36.363690 2610 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 21 14:55:36.370790 ip-10-0-129-133 kubenswrapper[2610]: W0421 14:55:36.363692 2610 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 21 14:55:36.370790 ip-10-0-129-133 kubenswrapper[2610]: W0421 14:55:36.363695 2610 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 21 14:55:36.371253 ip-10-0-129-133 kubenswrapper[2610]: W0421 14:55:36.363698 2610 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 21 14:55:36.371253 ip-10-0-129-133 kubenswrapper[2610]: W0421 14:55:36.363700 2610 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 21 14:55:36.371253 ip-10-0-129-133 kubenswrapper[2610]: W0421 14:55:36.363703 2610 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 21 14:55:36.371253 ip-10-0-129-133 kubenswrapper[2610]: W0421 14:55:36.363705 2610 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 21 14:55:36.371253 ip-10-0-129-133 kubenswrapper[2610]: W0421 14:55:36.363708 2610 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 21 14:55:36.371253 ip-10-0-129-133 kubenswrapper[2610]: W0421 14:55:36.363711 2610 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 21 14:55:36.371253 ip-10-0-129-133 kubenswrapper[2610]: W0421 14:55:36.363713 2610 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 21 14:55:36.371253 ip-10-0-129-133 kubenswrapper[2610]: W0421 14:55:36.363716 2610 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 21 14:55:36.371253 ip-10-0-129-133 kubenswrapper[2610]: W0421 14:55:36.363718 2610 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 21 14:55:36.371253 ip-10-0-129-133 kubenswrapper[2610]: W0421 14:55:36.363721 2610 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 21 14:55:36.371253 ip-10-0-129-133 kubenswrapper[2610]: W0421 14:55:36.363724 2610 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 21 14:55:36.371253 ip-10-0-129-133 kubenswrapper[2610]: W0421 14:55:36.363727 2610 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 21 14:55:36.371253 ip-10-0-129-133 kubenswrapper[2610]: W0421 14:55:36.363729 2610 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 21 14:55:36.371253 ip-10-0-129-133 kubenswrapper[2610]: W0421 14:55:36.363731 2610 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 21 14:55:36.371253 ip-10-0-129-133 kubenswrapper[2610]: W0421 14:55:36.363734 2610 feature_gate.go:328] unrecognized feature gate: Example
Apr 21 14:55:36.371253 ip-10-0-129-133 kubenswrapper[2610]: W0421 14:55:36.363737 2610 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 21 14:55:36.371253 ip-10-0-129-133 kubenswrapper[2610]: W0421 14:55:36.363739 2610 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 21 14:55:36.371253 ip-10-0-129-133 kubenswrapper[2610]: W0421 14:55:36.363742 2610 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 21 14:55:36.371253 ip-10-0-129-133 kubenswrapper[2610]: W0421 14:55:36.363744 2610 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 21 14:55:36.371253 ip-10-0-129-133 kubenswrapper[2610]: W0421 14:55:36.363747 2610 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 21 14:55:36.371749 ip-10-0-129-133 kubenswrapper[2610]: W0421 14:55:36.363750 2610 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 21 14:55:36.371749 ip-10-0-129-133 kubenswrapper[2610]: W0421 14:55:36.363753 2610 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 21 14:55:36.371749 ip-10-0-129-133 kubenswrapper[2610]: W0421 14:55:36.363755 2610 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 21 14:55:36.371749 ip-10-0-129-133 kubenswrapper[2610]: W0421 14:55:36.363757 2610 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 21 14:55:36.371749 ip-10-0-129-133 kubenswrapper[2610]: W0421 14:55:36.363760 2610 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 21 14:55:36.371749 ip-10-0-129-133 kubenswrapper[2610]: W0421 14:55:36.363764 2610 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 21 14:55:36.371749 ip-10-0-129-133 kubenswrapper[2610]: W0421 14:55:36.363766 2610 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 21 14:55:36.371749 ip-10-0-129-133 kubenswrapper[2610]: W0421 14:55:36.363769 2610 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 21 14:55:36.371749 ip-10-0-129-133 kubenswrapper[2610]: W0421 14:55:36.363771 2610 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 21 14:55:36.371749 ip-10-0-129-133 kubenswrapper[2610]: W0421 14:55:36.363774 2610 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 21 14:55:36.371749 ip-10-0-129-133 kubenswrapper[2610]: W0421 14:55:36.363776 2610 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 21 14:55:36.371749 ip-10-0-129-133 kubenswrapper[2610]: W0421 14:55:36.363779 2610 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 21 14:55:36.371749 ip-10-0-129-133 kubenswrapper[2610]: W0421 14:55:36.363781 2610 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 21 14:55:36.371749 ip-10-0-129-133 kubenswrapper[2610]: W0421 14:55:36.363783 2610 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 21 14:55:36.371749 ip-10-0-129-133 kubenswrapper[2610]: W0421 14:55:36.363786 2610 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 21 14:55:36.371749 ip-10-0-129-133 kubenswrapper[2610]: W0421 14:55:36.363788 2610 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 21 14:55:36.371749 ip-10-0-129-133 kubenswrapper[2610]: W0421 14:55:36.363791 2610 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 21 14:55:36.371749 ip-10-0-129-133 kubenswrapper[2610]: W0421 14:55:36.363793 2610 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 21 14:55:36.371749 ip-10-0-129-133 kubenswrapper[2610]: W0421 14:55:36.363796 2610 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 21 14:55:36.372251 ip-10-0-129-133 kubenswrapper[2610]: W0421 14:55:36.363798 2610 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 21 14:55:36.372251 ip-10-0-129-133 kubenswrapper[2610]: W0421 14:55:36.363801 2610 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 21 14:55:36.372251 ip-10-0-129-133 kubenswrapper[2610]: W0421 14:55:36.363803 2610 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 21 14:55:36.372251 ip-10-0-129-133 kubenswrapper[2610]: W0421 14:55:36.363806 2610 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 21 14:55:36.372251 ip-10-0-129-133 kubenswrapper[2610]: W0421 14:55:36.363808 2610 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 21 14:55:36.372251 ip-10-0-129-133 kubenswrapper[2610]: W0421 14:55:36.363811 2610 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 21 14:55:36.372251 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:55:36.364593 2610 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 21 14:55:36.372251 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:55:36.371329 2610 server.go:530] "Kubelet version" kubeletVersion="v1.33.9"
Apr 21 14:55:36.372251 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:55:36.371344 2610 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
Apr 21 14:55:36.372251 ip-10-0-129-133 kubenswrapper[2610]: W0421 14:55:36.371391 2610 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 21 14:55:36.372251 ip-10-0-129-133 kubenswrapper[2610]: W0421 14:55:36.371396 2610 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 21 14:55:36.372251 ip-10-0-129-133 kubenswrapper[2610]: W0421 14:55:36.371400 2610 feature_gate.go:328] unrecognized feature gate: Example2
Apr 21 14:55:36.372251 ip-10-0-129-133 kubenswrapper[2610]: W0421 14:55:36.371403 2610 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 21 14:55:36.372251 ip-10-0-129-133 kubenswrapper[2610]: W0421 14:55:36.371406 2610 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 21 14:55:36.372251 ip-10-0-129-133 kubenswrapper[2610]: W0421 14:55:36.371409 2610 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 21 14:55:36.372251 ip-10-0-129-133 kubenswrapper[2610]: W0421 14:55:36.371412 2610 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 21 14:55:36.372677 ip-10-0-129-133 kubenswrapper[2610]: W0421 14:55:36.371415 2610 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 21 14:55:36.372677 ip-10-0-129-133 kubenswrapper[2610]: W0421 14:55:36.371417 2610 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 21 14:55:36.372677 ip-10-0-129-133 kubenswrapper[2610]: W0421 14:55:36.371420 2610 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 21 14:55:36.372677 ip-10-0-129-133 kubenswrapper[2610]: W0421 14:55:36.371423 2610 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 21 14:55:36.372677 ip-10-0-129-133 kubenswrapper[2610]: W0421 14:55:36.371426 2610 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 21 14:55:36.372677 ip-10-0-129-133 kubenswrapper[2610]: W0421 14:55:36.371429 2610 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 21 14:55:36.372677 ip-10-0-129-133 kubenswrapper[2610]: W0421 14:55:36.371432 2610 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 21 14:55:36.372677 ip-10-0-129-133 kubenswrapper[2610]: W0421 14:55:36.371434 2610 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 21 14:55:36.372677 ip-10-0-129-133 kubenswrapper[2610]: W0421 14:55:36.371437 2610 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 21 14:55:36.372677 ip-10-0-129-133 kubenswrapper[2610]: W0421 14:55:36.371440 2610 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 21 14:55:36.372677 ip-10-0-129-133 kubenswrapper[2610]: W0421 14:55:36.371442 2610 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 21 14:55:36.372677 ip-10-0-129-133 kubenswrapper[2610]: W0421 14:55:36.371445 2610 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 21 14:55:36.372677 ip-10-0-129-133 kubenswrapper[2610]: W0421 14:55:36.371448 2610 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 21 14:55:36.372677 ip-10-0-129-133 kubenswrapper[2610]: W0421 14:55:36.371450 2610 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 21 14:55:36.372677 ip-10-0-129-133 kubenswrapper[2610]: W0421 14:55:36.371453 2610 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 21 14:55:36.372677 ip-10-0-129-133 kubenswrapper[2610]: W0421 14:55:36.371455 2610 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 21 14:55:36.372677 ip-10-0-129-133 kubenswrapper[2610]: W0421 14:55:36.371458 2610 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 21 14:55:36.372677 ip-10-0-129-133 kubenswrapper[2610]: W0421 14:55:36.371460 2610 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 21 14:55:36.372677 ip-10-0-129-133 kubenswrapper[2610]: W0421 14:55:36.371463 2610 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 21 14:55:36.372677 ip-10-0-129-133 kubenswrapper[2610]: W0421 14:55:36.371465 2610 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 21 14:55:36.373161 ip-10-0-129-133 kubenswrapper[2610]: W0421 14:55:36.371468 2610 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 21 14:55:36.373161 ip-10-0-129-133 kubenswrapper[2610]: W0421 14:55:36.371470 2610 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 21 14:55:36.373161 ip-10-0-129-133 kubenswrapper[2610]: W0421 14:55:36.371473 2610 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 21 14:55:36.373161 ip-10-0-129-133 kubenswrapper[2610]: W0421 14:55:36.371476 2610 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 21 14:55:36.373161 ip-10-0-129-133 kubenswrapper[2610]: W0421 14:55:36.371480 2610 feature_gate.go:328] unrecognized feature gate: Example
Apr 21 14:55:36.373161 ip-10-0-129-133 kubenswrapper[2610]: W0421 14:55:36.371482 2610 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 21 14:55:36.373161 ip-10-0-129-133 kubenswrapper[2610]: W0421 14:55:36.371485 2610 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 21 14:55:36.373161 ip-10-0-129-133 kubenswrapper[2610]: W0421 14:55:36.371487 2610 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 21 14:55:36.373161 ip-10-0-129-133 kubenswrapper[2610]: W0421 14:55:36.371490 2610 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 21 14:55:36.373161 ip-10-0-129-133 kubenswrapper[2610]: W0421 14:55:36.371493 2610 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 21 14:55:36.373161 ip-10-0-129-133 kubenswrapper[2610]: W0421 14:55:36.371495 2610 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 21 14:55:36.373161 ip-10-0-129-133 kubenswrapper[2610]: W0421 14:55:36.371498 2610 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 21 14:55:36.373161 ip-10-0-129-133 kubenswrapper[2610]: W0421 14:55:36.371501 2610 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 21 14:55:36.373161 ip-10-0-129-133 kubenswrapper[2610]: W0421 14:55:36.371503 2610 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 21 14:55:36.373161 ip-10-0-129-133 kubenswrapper[2610]: W0421 14:55:36.371506 2610 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 21 14:55:36.373161 ip-10-0-129-133 kubenswrapper[2610]: W0421 14:55:36.371508 2610 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 21 14:55:36.373161 ip-10-0-129-133 kubenswrapper[2610]: W0421 14:55:36.371510 2610 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 21 14:55:36.373161 ip-10-0-129-133 kubenswrapper[2610]: W0421 14:55:36.371513 2610 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 21 14:55:36.373161 ip-10-0-129-133 kubenswrapper[2610]: W0421 14:55:36.371516 2610 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 21 14:55:36.373161 ip-10-0-129-133 kubenswrapper[2610]: W0421 14:55:36.371518 2610 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 21 14:55:36.373657 ip-10-0-129-133 kubenswrapper[2610]: W0421 14:55:36.371521 2610 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 21 14:55:36.373657 ip-10-0-129-133 kubenswrapper[2610]: W0421 14:55:36.371523 2610 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 21 14:55:36.373657 ip-10-0-129-133 kubenswrapper[2610]: W0421 14:55:36.371526 2610 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 21 14:55:36.373657 ip-10-0-129-133 kubenswrapper[2610]: W0421 14:55:36.371528 2610 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 21 14:55:36.373657 ip-10-0-129-133 kubenswrapper[2610]: W0421 14:55:36.371531 2610 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 21 14:55:36.373657 ip-10-0-129-133 kubenswrapper[2610]: W0421 14:55:36.371533 2610 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 21 14:55:36.373657 ip-10-0-129-133 kubenswrapper[2610]: W0421 14:55:36.371535 2610 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 21 14:55:36.373657 ip-10-0-129-133 kubenswrapper[2610]: W0421 14:55:36.371538 2610 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 21 14:55:36.373657 ip-10-0-129-133 kubenswrapper[2610]: W0421 14:55:36.371540 2610 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 21 14:55:36.373657 ip-10-0-129-133 kubenswrapper[2610]: W0421 14:55:36.371543 2610 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 21 14:55:36.373657 ip-10-0-129-133 kubenswrapper[2610]: W0421 14:55:36.371545 2610 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 21 14:55:36.373657 ip-10-0-129-133 kubenswrapper[2610]: W0421 14:55:36.371547 2610 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 21 14:55:36.373657 ip-10-0-129-133 kubenswrapper[2610]: W0421 14:55:36.371550 2610 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 21 14:55:36.373657 ip-10-0-129-133 kubenswrapper[2610]: W0421 14:55:36.371552 2610 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 21 14:55:36.373657 ip-10-0-129-133 kubenswrapper[2610]: W0421 14:55:36.371555 2610 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 21 14:55:36.373657 ip-10-0-129-133 kubenswrapper[2610]: W0421 14:55:36.371558 2610 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 21 14:55:36.373657 ip-10-0-129-133 kubenswrapper[2610]: W0421 14:55:36.371560 2610 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 21 14:55:36.373657 ip-10-0-129-133 kubenswrapper[2610]: W0421 14:55:36.371564 2610 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 21 14:55:36.373657 ip-10-0-129-133 kubenswrapper[2610]: W0421 14:55:36.371567 2610 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 21 14:55:36.373657 ip-10-0-129-133 kubenswrapper[2610]: W0421 14:55:36.371569 2610 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 21 14:55:36.374133 ip-10-0-129-133 kubenswrapper[2610]: W0421 14:55:36.371587 2610 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 21 14:55:36.374133 ip-10-0-129-133 kubenswrapper[2610]: W0421 14:55:36.371590 2610 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 21 14:55:36.374133 ip-10-0-129-133 kubenswrapper[2610]: W0421 14:55:36.371594 2610 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 21 14:55:36.374133 ip-10-0-129-133 kubenswrapper[2610]: W0421 14:55:36.371598 2610 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 21 14:55:36.374133 ip-10-0-129-133 kubenswrapper[2610]: W0421 14:55:36.371601 2610 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 21 14:55:36.374133 ip-10-0-129-133 kubenswrapper[2610]: W0421 14:55:36.371604 2610 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 21 14:55:36.374133 ip-10-0-129-133 kubenswrapper[2610]: W0421 14:55:36.371606 2610 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 21 14:55:36.374133 ip-10-0-129-133 kubenswrapper[2610]: W0421 14:55:36.371608 2610 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 21 14:55:36.374133 ip-10-0-129-133 kubenswrapper[2610]: W0421 14:55:36.371611 2610 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 21 14:55:36.374133 ip-10-0-129-133 kubenswrapper[2610]: W0421 14:55:36.371613 2610 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 21 14:55:36.374133 ip-10-0-129-133 kubenswrapper[2610]: W0421 14:55:36.371616 2610 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 21 14:55:36.374133 ip-10-0-129-133 kubenswrapper[2610]: W0421 14:55:36.371618 2610 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 21 14:55:36.374133 ip-10-0-129-133 kubenswrapper[2610]: W0421 14:55:36.371621 2610 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 21 14:55:36.374133 ip-10-0-129-133 kubenswrapper[2610]: W0421 14:55:36.371623 2610 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 21 14:55:36.374133 ip-10-0-129-133 kubenswrapper[2610]: W0421 14:55:36.371626 2610 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 21 14:55:36.374133 ip-10-0-129-133 kubenswrapper[2610]: W0421 14:55:36.371630 2610 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 21 14:55:36.374133 ip-10-0-129-133 kubenswrapper[2610]: W0421 14:55:36.371635 2610 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 21 14:55:36.374133 ip-10-0-129-133 kubenswrapper[2610]: W0421 14:55:36.371638 2610 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 21 14:55:36.374133 ip-10-0-129-133 kubenswrapper[2610]: W0421 14:55:36.371640 2610 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 21 14:55:36.374626 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:55:36.371646 2610 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 21 14:55:36.374626 ip-10-0-129-133 kubenswrapper[2610]: W0421 14:55:36.371744 2610 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 21 14:55:36.374626 ip-10-0-129-133 kubenswrapper[2610]: W0421 14:55:36.371749 2610 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 21 14:55:36.374626 ip-10-0-129-133 kubenswrapper[2610]: W0421 14:55:36.371752 2610 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 21 14:55:36.374626 ip-10-0-129-133 kubenswrapper[2610]: W0421 14:55:36.371756 2610 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 21 14:55:36.374626 ip-10-0-129-133 kubenswrapper[2610]: W0421 14:55:36.371759 2610 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 21 14:55:36.374626 ip-10-0-129-133 kubenswrapper[2610]: W0421 14:55:36.371762 2610 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 21 14:55:36.374626 ip-10-0-129-133 kubenswrapper[2610]: W0421 14:55:36.371765 2610 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 21 14:55:36.374626 ip-10-0-129-133 kubenswrapper[2610]: W0421 14:55:36.371768 2610 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 21 14:55:36.374626 ip-10-0-129-133 kubenswrapper[2610]: W0421 14:55:36.371771 2610 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 21 14:55:36.374626 ip-10-0-129-133 kubenswrapper[2610]: W0421 14:55:36.371774 2610 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 21 14:55:36.374626 ip-10-0-129-133 kubenswrapper[2610]: W0421 14:55:36.371777 2610 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 21 14:55:36.374626 ip-10-0-129-133 kubenswrapper[2610]: W0421 14:55:36.371780 2610 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 21 14:55:36.374626 ip-10-0-129-133 kubenswrapper[2610]: W0421 14:55:36.371782 2610 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 21 14:55:36.374626 ip-10-0-129-133 kubenswrapper[2610]: W0421 14:55:36.371785 2610 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 21 14:55:36.374626 ip-10-0-129-133 kubenswrapper[2610]: W0421 14:55:36.371787 2610 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 21 14:55:36.375026 ip-10-0-129-133 kubenswrapper[2610]: W0421 14:55:36.371790 2610 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 21 14:55:36.375026 ip-10-0-129-133 kubenswrapper[2610]: W0421 14:55:36.371792 2610 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 21 14:55:36.375026 ip-10-0-129-133 kubenswrapper[2610]: W0421 14:55:36.371795 2610 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 21 14:55:36.375026 ip-10-0-129-133 kubenswrapper[2610]: W0421 14:55:36.371797 2610 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 21 14:55:36.375026 ip-10-0-129-133 kubenswrapper[2610]: W0421 14:55:36.371800 2610 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 21 14:55:36.375026 ip-10-0-129-133 kubenswrapper[2610]: W0421 14:55:36.371803 2610 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 21 14:55:36.375026 ip-10-0-129-133 kubenswrapper[2610]: W0421 14:55:36.371805 2610 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 21 14:55:36.375026 ip-10-0-129-133 kubenswrapper[2610]: W0421 14:55:36.371807 2610 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 21 14:55:36.375026 ip-10-0-129-133 kubenswrapper[2610]: W0421 14:55:36.371810 2610 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 21 14:55:36.375026 ip-10-0-129-133 kubenswrapper[2610]: W0421 14:55:36.371812 2610 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 21 14:55:36.375026 ip-10-0-129-133 kubenswrapper[2610]: W0421 14:55:36.371815 2610 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 21 14:55:36.375026 ip-10-0-129-133 kubenswrapper[2610]: W0421 14:55:36.371817 2610 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 21 14:55:36.375026 ip-10-0-129-133 kubenswrapper[2610]: W0421 14:55:36.371819 2610 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 21 14:55:36.375026 ip-10-0-129-133 kubenswrapper[2610]: W0421 14:55:36.371822 2610 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 21 14:55:36.375026 ip-10-0-129-133 kubenswrapper[2610]: W0421 14:55:36.371824 2610 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 21 14:55:36.375026 ip-10-0-129-133 kubenswrapper[2610]: W0421 14:55:36.371827 2610 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 21 14:55:36.375026 ip-10-0-129-133 kubenswrapper[2610]: W0421 14:55:36.371829 2610 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 21 14:55:36.375026 ip-10-0-129-133 kubenswrapper[2610]: W0421 14:55:36.371832 2610 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 21 14:55:36.375026 ip-10-0-129-133 kubenswrapper[2610]: W0421 14:55:36.371834 2610 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 21 14:55:36.375026 ip-10-0-129-133 kubenswrapper[2610]: W0421 14:55:36.371836 2610 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 21 14:55:36.375521 ip-10-0-129-133 kubenswrapper[2610]: W0421 14:55:36.371839 2610 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 21 14:55:36.375521 ip-10-0-129-133 kubenswrapper[2610]: W0421 14:55:36.371841 2610 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 21 14:55:36.375521 ip-10-0-129-133 kubenswrapper[2610]: W0421 14:55:36.371845 2610 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 21 14:55:36.375521 ip-10-0-129-133 kubenswrapper[2610]: W0421 14:55:36.371847 2610 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 21 14:55:36.375521 ip-10-0-129-133 kubenswrapper[2610]: W0421 14:55:36.371850 2610 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 21 14:55:36.375521 ip-10-0-129-133 kubenswrapper[2610]: W0421 14:55:36.371853 2610 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 21 14:55:36.375521 ip-10-0-129-133 kubenswrapper[2610]: W0421 14:55:36.371856 2610 feature_gate.go:328] unrecognized feature gate: Example2
Apr 21 14:55:36.375521 ip-10-0-129-133 kubenswrapper[2610]: W0421 14:55:36.371859 2610 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 21 14:55:36.375521 ip-10-0-129-133 kubenswrapper[2610]: W0421 14:55:36.371862 2610 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 21 14:55:36.375521 ip-10-0-129-133 kubenswrapper[2610]: W0421 14:55:36.371864 2610 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 21 14:55:36.375521 ip-10-0-129-133 kubenswrapper[2610]: W0421 14:55:36.371867 2610 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 21 14:55:36.375521 ip-10-0-129-133 kubenswrapper[2610]: W0421 14:55:36.371869 2610 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 21 14:55:36.375521 ip-10-0-129-133 kubenswrapper[2610]: W0421 14:55:36.371871 2610 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 21 14:55:36.375521 ip-10-0-129-133 kubenswrapper[2610]: W0421 14:55:36.371874 2610 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 21 14:55:36.375521 ip-10-0-129-133 kubenswrapper[2610]: W0421 14:55:36.371876 2610 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 21 14:55:36.375521 ip-10-0-129-133 kubenswrapper[2610]: W0421 14:55:36.371879 2610 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 21 14:55:36.375521 ip-10-0-129-133 kubenswrapper[2610]: W0421 14:55:36.371881 2610 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 21 14:55:36.375521 ip-10-0-129-133 kubenswrapper[2610]: W0421 14:55:36.371885 2610 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
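[Editor's note: each pass over the gate flags ends in an I-level summary record, feature_gate.go:384] feature gates: {map[...]}, as seen at 14:55:36.364593 and 14:55:36.371646 above. If that map is wanted as structured data, a minimal Python sketch against the exact format shown (parse_feature_gates is a hypothetical helper; record stands for one full journal line):

    # Extract the kubelet's effective feature-gate map from a summary record.
    import re

    def parse_feature_gates(record: str) -> dict:
        m = re.search(r"feature gates: \{map\[(.*?)\]\}", record)
        if not m:
            return {}
        return {
            name: value == "true"
            for name, value in (pair.split(":") for pair in m.group(1).split())
        }

    record = 'feature_gate.go:384] feature gates: {map[ImageVolume:true KMSv1:true NodeSwap:false]}'
    assert parse_feature_gates(record) == {"ImageVolume": True, "KMSv1": True, "NodeSwap": False}
]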
Apr 21 14:55:36.375521 ip-10-0-129-133 kubenswrapper[2610]: W0421 14:55:36.371888 2610 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 21 14:55:36.376006 ip-10-0-129-133 kubenswrapper[2610]: W0421 14:55:36.371891 2610 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 21 14:55:36.376006 ip-10-0-129-133 kubenswrapper[2610]: W0421 14:55:36.371894 2610 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 21 14:55:36.376006 ip-10-0-129-133 kubenswrapper[2610]: W0421 14:55:36.371897 2610 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 21 14:55:36.376006 ip-10-0-129-133 kubenswrapper[2610]: W0421 14:55:36.371899 2610 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 21 14:55:36.376006 ip-10-0-129-133 kubenswrapper[2610]: W0421 14:55:36.371902 2610 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 21 14:55:36.376006 ip-10-0-129-133 kubenswrapper[2610]: W0421 14:55:36.371904 2610 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 21 14:55:36.376006 ip-10-0-129-133 kubenswrapper[2610]: W0421 14:55:36.371906 2610 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 21 14:55:36.376006 ip-10-0-129-133 kubenswrapper[2610]: W0421 14:55:36.371909 2610 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 21 14:55:36.376006 ip-10-0-129-133 kubenswrapper[2610]: W0421 14:55:36.371911 2610 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 21 14:55:36.376006 ip-10-0-129-133 kubenswrapper[2610]: W0421 14:55:36.371914 2610 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 21 14:55:36.376006 ip-10-0-129-133 kubenswrapper[2610]: W0421 14:55:36.371917 2610 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 21 14:55:36.376006 ip-10-0-129-133 kubenswrapper[2610]: W0421 14:55:36.371919 2610 feature_gate.go:328] unrecognized feature gate: Example
Apr 21 14:55:36.376006 ip-10-0-129-133 kubenswrapper[2610]: W0421 14:55:36.371922 2610 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 21 14:55:36.376006 ip-10-0-129-133 kubenswrapper[2610]: W0421 14:55:36.371925 2610 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 21 14:55:36.376006 ip-10-0-129-133 kubenswrapper[2610]: W0421 14:55:36.371927 2610 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 21 14:55:36.376006 ip-10-0-129-133 kubenswrapper[2610]: W0421 14:55:36.371929 2610 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 21 14:55:36.376006 ip-10-0-129-133 kubenswrapper[2610]: W0421 14:55:36.371932 2610 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 21 14:55:36.376006 ip-10-0-129-133 kubenswrapper[2610]: W0421 14:55:36.371934 2610 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 21 14:55:36.376006 ip-10-0-129-133 kubenswrapper[2610]: W0421 14:55:36.371936 2610 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 21 14:55:36.376006 ip-10-0-129-133 kubenswrapper[2610]: W0421 14:55:36.371939 2610 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 21 14:55:36.376480 ip-10-0-129-133 kubenswrapper[2610]: W0421 14:55:36.371941 2610 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 21 14:55:36.376480 ip-10-0-129-133 kubenswrapper[2610]: W0421 14:55:36.371943 2610 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 21 14:55:36.376480 ip-10-0-129-133 kubenswrapper[2610]: W0421 14:55:36.371946 2610 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 21 14:55:36.376480 ip-10-0-129-133 kubenswrapper[2610]: W0421 14:55:36.371949 2610 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 21 14:55:36.376480 ip-10-0-129-133 kubenswrapper[2610]: W0421 14:55:36.371951 2610 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 21 14:55:36.376480 ip-10-0-129-133 kubenswrapper[2610]: W0421 14:55:36.371955 2610 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 21 14:55:36.376480 ip-10-0-129-133 kubenswrapper[2610]: W0421 14:55:36.371958 2610 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 21 14:55:36.376480 ip-10-0-129-133 kubenswrapper[2610]: W0421 14:55:36.371961 2610 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 21 14:55:36.376480 ip-10-0-129-133 kubenswrapper[2610]: W0421 14:55:36.371964 2610 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 21 14:55:36.376480 ip-10-0-129-133 kubenswrapper[2610]: W0421 14:55:36.371966 2610 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 21 14:55:36.376480 ip-10-0-129-133 kubenswrapper[2610]: W0421 14:55:36.371968 2610 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 21 14:55:36.376480 ip-10-0-129-133 kubenswrapper[2610]: W0421 14:55:36.371971 2610 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 21 14:55:36.376480 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:55:36.371976 2610 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 21 14:55:36.376480 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:55:36.372694 2610 server.go:962] "Client rotation is on, will bootstrap in background"
Apr 21 14:55:36.378043 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:55:36.378029 2610 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir"
Apr 21 14:55:36.380360 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:55:36.380349 2610 server.go:1019] "Starting client certificate rotation"
Apr 21 14:55:36.380461 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:55:36.380447 2610 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 21 14:55:36.380497 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:55:36.380482 2610 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 21 14:55:36.409499 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:55:36.409479 2610 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 21 14:55:36.414402 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:55:36.414376 2610 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 21 14:55:36.434431 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:55:36.434412 2610 log.go:25] "Validated CRI v1 runtime API"
Apr 21 14:55:36.440941 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:55:36.440926 2610 log.go:25] "Validated CRI v1 image API"
Apr 21 14:55:36.442086 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:55:36.442062 2610 server.go:1452] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd"
Apr 21 14:55:36.443350 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:55:36.443334 2610 reflector.go:430] "Caches populated" logger="kubernetes.io/kube-apiserver-client-kubelet" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
Apr 21 14:55:36.446198 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:55:36.446177 2610 fs.go:135] Filesystem UUIDs: map[0153def3-97e2-4ee4-8847-52cc7fc4d9cb:/dev/nvme0n1p3 7B77-95E7:/dev/nvme0n1p2 d3cb6339-96f4-4b82-ae2f-3b0457db5c62:/dev/nvme0n1p4]
Apr 21 14:55:36.446249 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:55:36.446198 2610 fs.go:136] Filesystem partitions: map[/dev/nvme0n1p3:{mountpoint:/boot major:259 minor:3 fsType:ext4 blockSize:0} /dev/nvme0n1p4:{mountpoint:/var major:259 minor:4 fsType:xfs blockSize:0} /dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:31 fsType:tmpfs blockSize:0} composefs_0-33:{mountpoint:/ major:0 minor:33 fsType:overlay blockSize:0}]
Apr 21 14:55:36.451200 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:55:36.451097 2610 manager.go:217] Machine: {Timestamp:2026-04-21 14:55:36.449935891 +0000 UTC m=+0.433827933 CPUVendorID:GenuineIntel NumCores:8 NumPhysicalCores:4 NumSockets:1 CpuFrequency:3099380 MemoryCapacity:32812171264 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:ec298d626928e96d310a7e2c6a7b949f SystemUUID:ec298d62-6928-e96d-310a-7e2c6a7b949f BootID:a56921f3-ac83-47e3-a0d1-13bebb00c865 Filesystems:[{Device:/dev/nvme0n1p4 DeviceMajor:259 DeviceMinor:4 Capacity:128243970048 Type:vfs Inodes:62651840 HasInodes:true} {Device:composefs_0-33 DeviceMajor:0 DeviceMinor:33 Capacity:6103040 Type:vfs Inodes:18446744073709551615 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:31 Capacity:16406085632 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/nvme0n1p3 DeviceMajor:259 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16406085632 Type:vfs Inodes:4005392 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6562435072 Type:vfs Inodes:819200 HasInodes:true}] DiskMap:map[259:0:{Name:nvme0n1 Major:259 Minor:0 Size:128849018880 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:02:ef:d9:7c:eb:85 Speed:0 Mtu:9001} {Name:ens5 MacAddress:02:ef:d9:7c:eb:85 Speed:0 Mtu:9001} {Name:ovs-system MacAddress:12:9e:f8:82:93:da Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:32812171264 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0 4] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:1 Threads:[1 5] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:2 Threads:[2 6] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:3 Threads:[3 7] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:}] Caches:[{Id:0 Size:34603008 Type:Unified Level:3}] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None}
Apr 21 14:55:36.451614 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:55:36.451604 2610 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available.
Apr 21 14:55:36.451701 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:55:36.451689 2610 manager.go:233] Version: {KernelVersion:5.14.0-570.107.1.el9_6.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 9.6.20260414-0 (Plow) DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:}
Apr 21 14:55:36.453287 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:55:36.453265 2610 container_manager_linux.go:270] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Apr 21 14:55:36.453423 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:55:36.453290 2610 container_manager_linux.go:275] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ip-10-0-129-133.ec2.internal","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"500m","ephemeral-storage":"1Gi","memory":"1Gi"},"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2}
Apr 21 14:55:36.453469 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:55:36.453435 2610 topology_manager.go:138] "Creating topology manager with none policy"
Apr 21 14:55:36.453469 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:55:36.453443 2610 container_manager_linux.go:306] "Creating device plugin manager"
Apr 21 14:55:36.453469 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:55:36.453455 2610 manager.go:141] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock"
Apr 21 14:55:36.454265 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:55:36.454255 2610 server.go:72] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock"
Apr 21 14:55:36.455413 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:55:36.455403 2610 state_mem.go:36] "Initialized new in-memory state store"
Apr 21 14:55:36.455713 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:55:36.455703 2610 server.go:1267] "Using root directory" path="/var/lib/kubelet"
Apr 21 14:55:36.458496 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:55:36.458485 2610 kubelet.go:491] "Attempting to sync node with API server"
Apr 21 14:55:36.458533 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:55:36.458502 2610 kubelet.go:386] "Adding static pod path" path="/etc/kubernetes/manifests"
Apr 21 14:55:36.458533 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:55:36.458514 2610 file.go:69] "Watching path" path="/etc/kubernetes/manifests"
Apr 21 14:55:36.458533 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:55:36.458523 2610 kubelet.go:397] "Adding apiserver pod source"
Apr 21 14:55:36.458533 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:55:36.458532 2610 apiserver.go:42] "Waiting for node sync before watching apiserver pods"
Apr 21 14:55:36.459760 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:55:36.459745 2610 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking"
Apr 21 14:55:36.459801 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:55:36.459773 2610 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking"
Apr 21 14:55:36.463695 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:55:36.463680 2610 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="cri-o" version="1.33.10-2.rhaos4.20.gita4d0894.el9" apiVersion="v1"
Apr 21 14:55:36.464917 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:55:36.464904 2610 kubelet.go:953] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled"
Apr 21 14:55:36.467500 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:55:36.467487 2610 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume"
Apr 21 14:55:36.467544 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:55:36.467511 2610 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir"
Apr 21 14:55:36.467544 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:55:36.467521 2610 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/git-repo"
Apr 21 14:55:36.467544 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:55:36.467527 2610 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/host-path"
Apr 21 14:55:36.467544 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:55:36.467533 2610 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/nfs"
Apr 21 14:55:36.467544 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:55:36.467539 2610 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/secret"
Apr 21 14:55:36.467544 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:55:36.467545 2610 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/iscsi"
Apr 21 14:55:36.467727 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:55:36.467552 2610 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/downward-api"
Apr 21 14:55:36.467727 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:55:36.467563 2610 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/fc"
Apr 21 14:55:36.467727 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:55:36.467569 2610 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/configmap"
Apr 21 14:55:36.467727 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:55:36.467601 2610 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/projected"
Apr 21 14:55:36.467727 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:55:36.467611 2610 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/local-volume"
Apr 21 14:55:36.468555 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:55:36.468546 2610 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/csi"
Apr 21 14:55:36.468599 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:55:36.468555 2610 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/image"
Apr 21 14:55:36.472305 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:55:36.472292 2610 watchdog_linux.go:99] "Systemd watchdog is not enabled"
Apr 21 14:55:36.472357 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:55:36.472343 2610 server.go:1295] "Started kubelet"
Apr 21 14:55:36.472482 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:55:36.472441 2610 server.go:180] "Starting to listen" address="0.0.0.0" port=10250
Apr 21 14:55:36.472638 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:55:36.472585 2610 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10
Apr 21 14:55:36.472673 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:55:36.472664 2610 server_v1.go:47] "podresources" method="list" useActivePods=true
Apr 21 14:55:36.473300 ip-10-0-129-133 systemd[1]: Started Kubernetes Kubelet.
Apr 21 14:55:36.475857 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:55:36.475836 2610 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "ip-10-0-129-133.ec2.internal" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Apr 21 14:55:36.476007 ip-10-0-129-133 kubenswrapper[2610]: E0421 14:55:36.475977 2610 reflector.go:200] "Failed to watch" err="failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service"
Apr 21 14:55:36.476103 ip-10-0-129-133 kubenswrapper[2610]: E0421 14:55:36.476031 2610 reflector.go:200] "Failed to watch" err="failed to list *v1.Node: nodes \"ip-10-0-129-133.ec2.internal\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node"
Apr 21 14:55:36.476103 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:55:36.476043 2610 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock"
Apr 21 14:55:36.478263 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:55:36.478235 2610 server.go:317] "Adding debug handlers to kubelet server"
Apr 21 14:55:36.481533 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:55:36.481516 2610 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kubelet-serving"
Apr 21 14:55:36.481979 ip-10-0-129-133 kubenswrapper[2610]: E0421 14:55:36.480648 2610 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-129-133.ec2.internal.18a86707dd43db03 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-129-133.ec2.internal,UID:ip-10-0-129-133.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:ip-10-0-129-133.ec2.internal,},FirstTimestamp:2026-04-21 14:55:36.472304387 +0000 UTC m=+0.456196426,LastTimestamp:2026-04-21 14:55:36.472304387 +0000 UTC m=+0.456196426,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-129-133.ec2.internal,}"
Apr 21 14:55:36.482060 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:55:36.482046 2610 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer"
Apr 21 14:55:36.482801 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:55:36.482783 2610 volume_manager.go:295] "The desired_state_of_world populator starts"
Apr 21 14:55:36.482801 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:55:36.482805 2610 volume_manager.go:297] "Starting Kubelet Volume Manager"
Apr 21 14:55:36.482947 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:55:36.482837 2610 factory.go:221] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory
Apr 21 14:55:36.482947 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:55:36.482848 2610 factory.go:55] Registering systemd factory
Apr 21 14:55:36.482947 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:55:36.482855 2610 factory.go:223] Registration of the systemd container factory successfully
Apr 21 14:55:36.483112 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:55:36.483099 2610 desired_state_of_world_populator.go:150] "Desired state populator starts to run"
Apr 21 14:55:36.483162 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:55:36.483117 2610 factory.go:153] Registering CRI-O factory
Apr 21 14:55:36.483162 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:55:36.483130 2610 factory.go:223] Registration of the crio container factory successfully
Apr 21 14:55:36.483162 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:55:36.483151 2610 factory.go:103] Registering Raw factory
Apr 21 14:55:36.483162 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:55:36.483164 2610 manager.go:1196] Started watching for new ooms in manager
Apr 21 14:55:36.483348 ip-10-0-129-133 kubenswrapper[2610]: E0421 14:55:36.483191 2610 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-129-133.ec2.internal\" not found"
Apr 21 14:55:36.483348 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:55:36.483201 2610 reconstruct.go:97] "Volume reconstruction finished"
Apr 21 14:55:36.483348 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:55:36.483229 2610 reconciler.go:26] "Reconciler: start to sync state"
Apr 21 14:55:36.483502 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:55:36.483491 2610 manager.go:319] Starting recovery of all containers
Apr 21 14:55:36.484321 ip-10-0-129-133 kubenswrapper[2610]: E0421 14:55:36.484294 2610 kubelet.go:1618] "Image garbage collection failed once. Stats initialization may not have completed yet" err="failed to get imageFs info: unable to find data in memory cache"
Apr 21 14:55:36.491864 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:55:36.491702 2610 manager.go:324] Recovery completed
Apr 21 14:55:36.497132 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:55:36.497116 2610 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 21 14:55:36.497999 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:55:36.497980 2610 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-spq4l"
Apr 21 14:55:36.498457 ip-10-0-129-133 kubenswrapper[2610]: E0421 14:55:36.498436 2610 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"ip-10-0-129-133.ec2.internal\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="200ms"
Apr 21 14:55:36.499685 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:55:36.499662 2610 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-133.ec2.internal" event="NodeHasSufficientMemory"
Apr 21 14:55:36.499772 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:55:36.499705 2610 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-133.ec2.internal" event="NodeHasNoDiskPressure"
Apr 21 14:55:36.499772 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:55:36.499731 2610 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-133.ec2.internal" event="NodeHasSufficientPID"
Apr 21 14:55:36.500203 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:55:36.500189 2610 cpu_manager.go:222] "Starting CPU manager" policy="none"
Apr 21 14:55:36.500260 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:55:36.500212 2610 cpu_manager.go:223] "Reconciling" reconcilePeriod="10s"
Apr 21 14:55:36.500260 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:55:36.500226 2610 state_mem.go:36] "Initialized new in-memory state store"
Apr 21 14:55:36.501774 ip-10-0-129-133 kubenswrapper[2610]: E0421 14:55:36.501702 2610 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-129-133.ec2.internal.18a86707dee5c2cf default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-129-133.ec2.internal,UID:ip-10-0-129-133.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node ip-10-0-129-133.ec2.internal status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:ip-10-0-129-133.ec2.internal,},FirstTimestamp:2026-04-21 14:55:36.499692239 +0000 UTC m=+0.483584279,LastTimestamp:2026-04-21 14:55:36.499692239 +0000 UTC m=+0.483584279,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-129-133.ec2.internal,}"
Apr 21 14:55:36.501861 ip-10-0-129-133 kubenswrapper[2610]: E0421 14:55:36.501791 2610 reflector.go:200] "Failed to watch" err="failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User \"system:anonymous\" cannot list resource \"csidrivers\" in API group \"storage.k8s.io\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver"
Apr 21 14:55:36.502341 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:55:36.502328 2610 policy_none.go:49] "None policy: Start"
Apr 21 14:55:36.502380 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:55:36.502346 2610 memory_manager.go:186] "Starting memorymanager" policy="None"
Apr 21 14:55:36.502380 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:55:36.502357 2610 state_mem.go:35] "Initializing new in-memory state store"
Apr 21 14:55:36.503539 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:55:36.503523 2610 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-spq4l"
Apr 21 14:55:36.561440 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:55:36.543874 2610 manager.go:341] "Starting Device Plugin manager"
Apr 21 14:55:36.561440 ip-10-0-129-133 kubenswrapper[2610]: E0421 14:55:36.543902 2610 manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint"
Apr 21 14:55:36.561440 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:55:36.543911 2610 server.go:85] "Starting device plugin registration server"
Apr 21 14:55:36.561440 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:55:36.544159 2610 eviction_manager.go:189] "Eviction manager: starting control loop"
Apr 21 14:55:36.561440 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:55:36.544170 2610 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s"
Apr 21 14:55:36.561440 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:55:36.544265 2610 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry"
Apr 21 14:55:36.561440 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:55:36.544337 2610 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts"
Apr 21 14:55:36.561440 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:55:36.544345 2610 plugin_manager.go:118] "Starting Kubelet Plugin Manager"
Apr 21 14:55:36.561440 ip-10-0-129-133 kubenswrapper[2610]: E0421 14:55:36.544920 2610 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="non-existent label \"crio-containers\""
Apr 21 14:55:36.561440 ip-10-0-129-133 kubenswrapper[2610]: E0421 14:55:36.544957 2610 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ip-10-0-129-133.ec2.internal\" not found"
Apr 21 14:55:36.618754 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:55:36.618682 2610 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv4"
Apr 21 14:55:36.620026 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:55:36.620009 2610 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv6"
Apr 21 14:55:36.620078 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:55:36.620043 2610 status_manager.go:230] "Starting to sync pod status with apiserver"
Apr 21 14:55:36.620078 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:55:36.620066 2610 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started."
Apr 21 14:55:36.620078 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:55:36.620074 2610 kubelet.go:2451] "Starting kubelet main sync loop"
Apr 21 14:55:36.620174 ip-10-0-129-133 kubenswrapper[2610]: E0421 14:55:36.620114 2610 kubelet.go:2475] "Skipping pod synchronization" err="PLEG is not healthy: pleg has yet to be successful"
Apr 21 14:55:36.623520 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:55:36.623494 2610 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 21 14:55:36.644795 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:55:36.644781 2610 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 21 14:55:36.645872 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:55:36.645854 2610 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-133.ec2.internal" event="NodeHasSufficientMemory"
Apr 21 14:55:36.645946 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:55:36.645883 2610 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-133.ec2.internal" event="NodeHasNoDiskPressure"
Apr 21 14:55:36.645946 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:55:36.645895 2610 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-133.ec2.internal" event="NodeHasSufficientPID"
Apr 21 14:55:36.645946 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:55:36.645915 2610 kubelet_node_status.go:78] "Attempting to register node" node="ip-10-0-129-133.ec2.internal"
Apr 21 14:55:36.654484 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:55:36.654465 2610 kubelet_node_status.go:81] "Successfully registered node" node="ip-10-0-129-133.ec2.internal"
Apr 21 14:55:36.654545 ip-10-0-129-133 kubenswrapper[2610]: E0421 14:55:36.654491 2610 kubelet_node_status.go:597] "Error updating node status, will retry" err="error getting node \"ip-10-0-129-133.ec2.internal\": node \"ip-10-0-129-133.ec2.internal\" not found"
Apr 21 14:55:36.696232 ip-10-0-129-133 kubenswrapper[2610]: E0421 14:55:36.696201 2610 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-129-133.ec2.internal\" not found"
Apr 21 14:55:36.720316 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:55:36.720291 2610 kubelet.go:2537] "SyncLoop ADD" source="file" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-133.ec2.internal","kube-system/kube-apiserver-proxy-ip-10-0-129-133.ec2.internal"]
Apr 21 14:55:36.720387 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:55:36.720360 2610 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 21 14:55:36.722142 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:55:36.722120 2610 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-133.ec2.internal" event="NodeHasSufficientMemory"
Apr 21 14:55:36.722232 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:55:36.722153 2610 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-133.ec2.internal" event="NodeHasNoDiskPressure"
Apr 21 14:55:36.722232 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:55:36.722163 2610 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-133.ec2.internal" event="NodeHasSufficientPID"
Apr 21 14:55:36.724416 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:55:36.724403 2610 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 21 14:55:36.724598 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:55:36.724567 2610 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-133.ec2.internal"
Apr 21 14:55:36.724640 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:55:36.724615 2610 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 21 14:55:36.726670 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:55:36.726652 2610 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-133.ec2.internal" event="NodeHasSufficientMemory"
Apr 21 14:55:36.726670 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:55:36.726669 2610 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-133.ec2.internal" event="NodeHasSufficientMemory"
Apr 21 14:55:36.726830 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:55:36.726681 2610 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-133.ec2.internal" event="NodeHasNoDiskPressure"
Apr 21 14:55:36.726830 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:55:36.726691 2610 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-133.ec2.internal" event="NodeHasNoDiskPressure"
Apr 21 14:55:36.726830 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:55:36.726696 2610 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-133.ec2.internal" event="NodeHasSufficientPID"
Apr 21 14:55:36.726830 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:55:36.726706 2610 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-133.ec2.internal" event="NodeHasSufficientPID"
Apr 21 14:55:36.728887 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:55:36.728871 2610 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-129-133.ec2.internal"
Apr 21 14:55:36.728964 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:55:36.728901 2610 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 21 14:55:36.729624 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:55:36.729605 2610 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-133.ec2.internal" event="NodeHasSufficientMemory"
Apr 21 14:55:36.729699 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:55:36.729636 2610 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-133.ec2.internal" event="NodeHasNoDiskPressure"
Apr 21 14:55:36.729699 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:55:36.729647 2610 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-133.ec2.internal" event="NodeHasSufficientPID"
Apr 21 14:55:36.764245 ip-10-0-129-133 kubenswrapper[2610]: E0421 14:55:36.764222 2610 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-129-133.ec2.internal\" not found" node="ip-10-0-129-133.ec2.internal"
Apr 21 14:55:36.768711 ip-10-0-129-133 kubenswrapper[2610]: E0421 14:55:36.768693 2610 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-129-133.ec2.internal\" not found" node="ip-10-0-129-133.ec2.internal"
Apr 21 14:55:36.784963 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:55:36.784945 2610 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/bec3f852eac0a721026594b796d3fd1a-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-129-133.ec2.internal\" (UID: \"bec3f852eac0a721026594b796d3fd1a\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-133.ec2.internal"
Apr 21 14:55:36.785069 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:55:36.784975 2610 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/bec3f852eac0a721026594b796d3fd1a-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-129-133.ec2.internal\" (UID: \"bec3f852eac0a721026594b796d3fd1a\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-133.ec2.internal"
Apr 21 14:55:36.785069 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:55:36.785005 2610 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/4074072fa96faab5923784feb5b91477-config\") pod \"kube-apiserver-proxy-ip-10-0-129-133.ec2.internal\" (UID: \"4074072fa96faab5923784feb5b91477\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-129-133.ec2.internal"
Apr 21 14:55:36.796466 ip-10-0-129-133 kubenswrapper[2610]: E0421 14:55:36.796446 2610 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-129-133.ec2.internal\" not found"
Apr 21 14:55:36.886109 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:55:36.886014 2610 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/bec3f852eac0a721026594b796d3fd1a-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-129-133.ec2.internal\" (UID: \"bec3f852eac0a721026594b796d3fd1a\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-133.ec2.internal"
Apr 21 14:55:36.886109 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:55:36.886064 2610 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/bec3f852eac0a721026594b796d3fd1a-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-129-133.ec2.internal\" (UID: \"bec3f852eac0a721026594b796d3fd1a\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-133.ec2.internal"
Apr 21 14:55:36.886109 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:55:36.886107 2610 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/4074072fa96faab5923784feb5b91477-config\") pod \"kube-apiserver-proxy-ip-10-0-129-133.ec2.internal\" (UID: \"4074072fa96faab5923784feb5b91477\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-129-133.ec2.internal"
Apr 21 14:55:36.886316 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:55:36.886147 2610 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/host-path/4074072fa96faab5923784feb5b91477-config\") pod \"kube-apiserver-proxy-ip-10-0-129-133.ec2.internal\" (UID: \"4074072fa96faab5923784feb5b91477\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-129-133.ec2.internal"
Apr 21 14:55:36.886316 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:55:36.886148 2610 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/bec3f852eac0a721026594b796d3fd1a-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-129-133.ec2.internal\" (UID: \"bec3f852eac0a721026594b796d3fd1a\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-133.ec2.internal"
Apr 21 14:55:36.886316 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:55:36.886192 2610 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/bec3f852eac0a721026594b796d3fd1a-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-129-133.ec2.internal\" (UID: \"bec3f852eac0a721026594b796d3fd1a\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-133.ec2.internal"
Apr 21 14:55:36.897117 ip-10-0-129-133 kubenswrapper[2610]: E0421 14:55:36.897095 2610 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-129-133.ec2.internal\" not found"
Apr 21 14:55:36.997945 ip-10-0-129-133 kubenswrapper[2610]: E0421 14:55:36.997911 2610 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-129-133.ec2.internal\" not found"
Apr 21 14:55:37.066141 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:55:37.066110 2610 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-133.ec2.internal"
Apr 21 14:55:37.070650 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:55:37.070633 2610 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-129-133.ec2.internal"
Apr 21 14:55:37.098649 ip-10-0-129-133 kubenswrapper[2610]: E0421 14:55:37.098620 2610 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-129-133.ec2.internal\" not found"
Apr 21 14:55:37.199269 ip-10-0-129-133 kubenswrapper[2610]: E0421 14:55:37.199200 2610 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-129-133.ec2.internal\" not found"
Apr 21 14:55:37.299712 ip-10-0-129-133 kubenswrapper[2610]: E0421 14:55:37.299677 2610 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-129-133.ec2.internal\" not found"
Apr 21 14:55:37.378887 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:55:37.378863 2610 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials"
Apr 21 14:55:37.379398 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:55:37.378998 2610 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received"
Apr 21 14:55:37.400391 ip-10-0-129-133 kubenswrapper[2610]: E0421 14:55:37.400366 2610 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-129-133.ec2.internal\" not found"
Apr 21 14:55:37.456921 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:55:37.456856 2610 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 21 14:55:37.457066 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:55:37.457052 2610 reflector.go:430] "Caches populated" type="*v1.Node" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 21 14:55:37.459547 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:55:37.459534 2610 apiserver.go:52] "Watching apiserver"
Apr 21 14:55:37.468446 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:55:37.468424 2610 reflector.go:430] "Caches populated" type="*v1.Pod" reflector="pkg/kubelet/config/apiserver.go:66"
Apr 21 14:55:37.470939 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:55:37.470858 2610 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-network-operator/iptables-alerter-xpgf5","openshift-ovn-kubernetes/ovnkube-node-fqshv","openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-fzdmw","openshift-cluster-node-tuning-operator/tuned-jr2bx","openshift-dns/node-resolver-n8qc9","openshift-image-registry/node-ca-z6hwp","openshift-multus/network-metrics-daemon-mtdkf","kube-system/konnectivity-agent-qgsvb","openshift-multus/multus-additional-cni-plugins-kjdc5","openshift-multus/multus-whzlb","openshift-network-diagnostics/network-check-target-gphbk"]
Apr 21 14:55:37.474545 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:55:37.474525 2610 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-xpgf5"
Apr 21 14:55:37.476851 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:55:37.476831 2610 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-fqshv"
Apr 21 14:55:37.478741 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:55:37.478693 2610 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-dockercfg-2znd6\""
Apr 21 14:55:37.478741 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:55:37.478717 2610 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-script\""
Apr 21 14:55:37.478971 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:55:37.478956 2610 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-fzdmw"
Apr 21 14:55:37.479228 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:55:37.479209 2610 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"openshift-service-ca.crt\""
Apr 21 14:55:37.480419 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:55:37.480404 2610 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"openshift-service-ca.crt\""
Apr 21 14:55:37.481243 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:55:37.480806 2610 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-config\""
Apr 21 14:55:37.481243 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:55:37.480868 2610 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"kube-root-ca.crt\""
Apr 21 14:55:37.481243 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:55:37.480891 2610 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-node-metrics-cert\""
Apr 21 14:55:37.481243 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:55:37.480995 2610 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-n8qc9"
Apr 21 14:55:37.481243 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:55:37.481027 2610 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"env-overrides\""
Apr 21 14:55:37.481243 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:55:37.481036 2610 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"kube-root-ca.crt\""
Apr 21 14:55:37.481243 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:55:37.481145 2610 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"kube-root-ca.crt\""
Apr 21 14:55:37.481540 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:55:37.481337 2610 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"openshift-service-ca.crt\""
Apr 21 14:55:37.481540 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:55:37.481440 2610 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-metrics-serving-cert\""
Apr 21 14:55:37.481540 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:55:37.481517 2610 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-kubernetes-node-dockercfg-vqbxz\""
Apr 21 14:55:37.481683 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:55:37.481623 2610 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kubelet-serving"
Apr 21 14:55:37.481683 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:55:37.481630 2610 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-sa-dockercfg-gw772\""
Apr 21 14:55:37.481941 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:55:37.481924 2610 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-script-lib\""
Apr 21 14:55:37.482700 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:55:37.482686 2610 kubelet.go:3340] "Creating a mirror pod for static pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-133.ec2.internal"
Apr 21 14:55:37.483172 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:55:37.483156 2610 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-mtdkf"
Apr 21 14:55:37.483247 ip-10-0-129-133 kubenswrapper[2610]: E0421 14:55:37.483218 2610 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-mtdkf" podUID="9b064625-50f7-4c6a-be44-9aed34a00b26"
Apr 21 14:55:37.486039 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:55:37.486013 2610 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"kube-root-ca.crt\""
Apr 21 14:55:37.486131 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:55:37.486079 2610 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-qgsvb"
Apr 21 14:55:37.488339 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:55:37.488311 2610 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"openshift-service-ca.crt\""
Apr 21 14:55:37.488504 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:55:37.488490 2610 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-kjdc5"
Apr 21 14:55:37.489588 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:55:37.489551 2610 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/cff88e04-38fb-4737-b9d3-2f25f36cf06c-ovnkube-script-lib\") pod \"ovnkube-node-fqshv\" (UID: \"cff88e04-38fb-4737-b9d3-2f25f36cf06c\") " pod="openshift-ovn-kubernetes/ovnkube-node-fqshv"
Apr 21 14:55:37.489695 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:55:37.489601 2610 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/cff88e04-38fb-4737-b9d3-2f25f36cf06c-host-cni-bin\") pod \"ovnkube-node-fqshv\" (UID: \"cff88e04-38fb-4737-b9d3-2f25f36cf06c\") " pod="openshift-ovn-kubernetes/ovnkube-node-fqshv"
Apr 21 14:55:37.489695 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:55:37.489672 2610 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/efd7538a-c71f-4a2a-99a4-44675f8eab54-sys-fs\") pod \"aws-ebs-csi-driver-node-fzdmw\" (UID: \"efd7538a-c71f-4a2a-99a4-44675f8eab54\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-fzdmw"
Apr 21 14:55:37.489788 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:55:37.489701 2610 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/002f040e-530f-43cc-92d7-0789dd3ec88e-konnectivity-ca\") pod \"konnectivity-agent-qgsvb\" (UID: \"002f040e-530f-43cc-92d7-0789dd3ec88e\") " pod="kube-system/konnectivity-agent-qgsvb"
Apr 21 14:55:37.489788 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:55:37.489727 2610 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/9c5fa724-5c79-4789-8467-fe3456892c7d-iptables-alerter-script\") pod \"iptables-alerter-xpgf5\" (UID: \"9c5fa724-5c79-4789-8467-fe3456892c7d\") " pod="openshift-network-operator/iptables-alerter-xpgf5"
Apr 21 14:55:37.489788 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:55:37.489747 2610 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/cff88e04-38fb-4737-b9d3-2f25f36cf06c-host-cni-netd\") pod \"ovnkube-node-fqshv\" (UID: \"cff88e04-38fb-4737-b9d3-2f25f36cf06c\") " pod="openshift-ovn-kubernetes/ovnkube-node-fqshv"
Apr 21 14:55:37.489788 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:55:37.489769 2610 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/efd7538a-c71f-4a2a-99a4-44675f8eab54-device-dir\") pod \"aws-ebs-csi-driver-node-fzdmw\" (UID: \"efd7538a-c71f-4a2a-99a4-44675f8eab54\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-fzdmw"
Apr 21 14:55:37.490073 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:55:37.489792 2610 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/9b064625-50f7-4c6a-be44-9aed34a00b26-metrics-certs\") pod \"network-metrics-daemon-mtdkf\" (UID: \"9b064625-50f7-4c6a-be44-9aed34a00b26\") " pod="openshift-multus/network-metrics-daemon-mtdkf"
Apr 21 14:55:37.490073 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:55:37.489822 2610 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/cff88e04-38fb-4737-b9d3-2f25f36cf06c-systemd-units\") pod \"ovnkube-node-fqshv\" (UID: \"cff88e04-38fb-4737-b9d3-2f25f36cf06c\") " pod="openshift-ovn-kubernetes/ovnkube-node-fqshv"
Apr 21 14:55:37.490073 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:55:37.489848 2610 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/cff88e04-38fb-4737-b9d3-2f25f36cf06c-run-openvswitch\") pod \"ovnkube-node-fqshv\" (UID: \"cff88e04-38fb-4737-b9d3-2f25f36cf06c\") " pod="openshift-ovn-kubernetes/ovnkube-node-fqshv"
Apr 21 14:55:37.490073 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:55:37.489875 2610 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/cff88e04-38fb-4737-b9d3-2f25f36cf06c-run-ovn\") pod \"ovnkube-node-fqshv\" (UID: \"cff88e04-38fb-4737-b9d3-2f25f36cf06c\") " pod="openshift-ovn-kubernetes/ovnkube-node-fqshv"
Apr 21 14:55:37.490073 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:55:37.489913 2610 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/cff88e04-38fb-4737-b9d3-2f25f36cf06c-node-log\") pod \"ovnkube-node-fqshv\" (UID: \"cff88e04-38fb-4737-b9d3-2f25f36cf06c\") " pod="openshift-ovn-kubernetes/ovnkube-node-fqshv"
Apr 21 14:55:37.490073 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:55:37.489952 2610 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/efd7538a-c71f-4a2a-99a4-44675f8eab54-etc-selinux\") pod \"aws-ebs-csi-driver-node-fzdmw\" (UID: \"efd7538a-c71f-4a2a-99a4-44675f8eab54\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-fzdmw"
Apr 21 14:55:37.490073 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:55:37.489986 2610 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fkvqs\" (UniqueName: \"kubernetes.io/projected/b48f3832-4ecd-46ba-bde8-35a4180bf3ca-kube-api-access-fkvqs\") pod \"node-resolver-n8qc9\" (UID: \"b48f3832-4ecd-46ba-bde8-35a4180bf3ca\") " pod="openshift-dns/node-resolver-n8qc9"
Apr 21 14:55:37.490073 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:55:37.490028 2610 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/9c5fa724-5c79-4789-8467-fe3456892c7d-host-slash\") pod \"iptables-alerter-xpgf5\" (UID: \"9c5fa724-5c79-4789-8467-fe3456892c7d\") " pod="openshift-network-operator/iptables-alerter-xpgf5"
Apr 21 14:55:37.490073 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:55:37.490048 2610 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/cff88e04-38fb-4737-b9d3-2f25f36cf06c-host-slash\") pod \"ovnkube-node-fqshv\" (UID: \"cff88e04-38fb-4737-b9d3-2f25f36cf06c\") " pod="openshift-ovn-kubernetes/ovnkube-node-fqshv"
Apr 21 14:55:37.490073 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:55:37.490070 2610 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/cff88e04-38fb-4737-b9d3-2f25f36cf06c-etc-openvswitch\") pod \"ovnkube-node-fqshv\" (UID: \"cff88e04-38fb-4737-b9d3-2f25f36cf06c\") " pod="openshift-ovn-kubernetes/ovnkube-node-fqshv"
Apr 21 14:55:37.490566 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:55:37.490093 2610 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/cff88e04-38fb-4737-b9d3-2f25f36cf06c-host-run-ovn-kubernetes\") pod \"ovnkube-node-fqshv\" (UID: \"cff88e04-38fb-4737-b9d3-2f25f36cf06c\") " pod="openshift-ovn-kubernetes/ovnkube-node-fqshv"
Apr 21 14:55:37.490566 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:55:37.490095 2610 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kube-system\"/\"konnectivity-ca-bundle\""
Apr 21 14:55:37.490566 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:55:37.490114 2610 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/cff88e04-38fb-4737-b9d3-2f25f36cf06c-ovnkube-config\") pod \"ovnkube-node-fqshv\" (UID: \"cff88e04-38fb-4737-b9d3-2f25f36cf06c\") " pod="openshift-ovn-kubernetes/ovnkube-node-fqshv"
Apr 21 14:55:37.490566 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:55:37.490153 2610 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/efd7538a-c71f-4a2a-99a4-44675f8eab54-registration-dir\") pod \"aws-ebs-csi-driver-node-fzdmw\" (UID: \"efd7538a-c71f-4a2a-99a4-44675f8eab54\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-fzdmw"
Apr 21 14:55:37.490566 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:55:37.490197 2610 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/b48f3832-4ecd-46ba-bde8-35a4180bf3ca-tmp-dir\") pod \"node-resolver-n8qc9\" (UID: \"b48f3832-4ecd-46ba-bde8-35a4180bf3ca\") " pod="openshift-dns/node-resolver-n8qc9"
Apr 21 14:55:37.490566 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:55:37.490228 2610 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/002f040e-530f-43cc-92d7-0789dd3ec88e-agent-certs\") pod \"konnectivity-agent-qgsvb\" (UID: \"002f040e-530f-43cc-92d7-0789dd3ec88e\") " pod="kube-system/konnectivity-agent-qgsvb"
Apr 21 14:55:37.490566 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:55:37.490254 2610 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/cff88e04-38fb-4737-b9d3-2f25f36cf06c-host-kubelet\") pod \"ovnkube-node-fqshv\" (UID: \"cff88e04-38fb-4737-b9d3-2f25f36cf06c\") " pod="openshift-ovn-kubernetes/ovnkube-node-fqshv"
Apr 21 14:55:37.490566 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:55:37.490280 2610 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/cff88e04-38fb-4737-b9d3-2f25f36cf06c-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-fqshv\" (UID: \"cff88e04-38fb-4737-b9d3-2f25f36cf06c\") " pod="openshift-ovn-kubernetes/ovnkube-node-fqshv"
Apr 21 14:55:37.490566 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:55:37.490310 2610 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/efd7538a-c71f-4a2a-99a4-44675f8eab54-socket-dir\") pod \"aws-ebs-csi-driver-node-fzdmw\" (UID: \"efd7538a-c71f-4a2a-99a4-44675f8eab54\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-fzdmw"
Apr 21 14:55:37.490566 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:55:37.490321 2610 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"default-dockercfg-n5rp8\""
Apr 21 14:55:37.490566 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:55:37.490339 2610 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l8v4w\" (UniqueName: \"kubernetes.io/projected/efd7538a-c71f-4a2a-99a4-44675f8eab54-kube-api-access-l8v4w\") pod \"aws-ebs-csi-driver-node-fzdmw\" (UID: \"efd7538a-c71f-4a2a-99a4-44675f8eab54\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-fzdmw"
Apr 21 14:55:37.490566 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:55:37.490368 2610 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"konnectivity-agent\""
Apr 21 14:55:37.490566 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:55:37.490370 2610 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ct74t\" (UniqueName: \"kubernetes.io/projected/9c5fa724-5c79-4789-8467-fe3456892c7d-kube-api-access-ct74t\") pod \"iptables-alerter-xpgf5\" (UID: \"9c5fa724-5c79-4789-8467-fe3456892c7d\") " pod="openshift-network-operator/iptables-alerter-xpgf5"
Apr 21 14:55:37.490566 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:55:37.490420 2610 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/cff88e04-38fb-4737-b9d3-2f25f36cf06c-host-run-netns\") pod \"ovnkube-node-fqshv\" (UID: \"cff88e04-38fb-4737-b9d3-2f25f36cf06c\") " pod="openshift-ovn-kubernetes/ovnkube-node-fqshv"
Apr 21 14:55:37.490566 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:55:37.490442 2610 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/cff88e04-38fb-4737-b9d3-2f25f36cf06c-env-overrides\") pod \"ovnkube-node-fqshv\" (UID: \"cff88e04-38fb-4737-b9d3-2f25f36cf06c\") " pod="openshift-ovn-kubernetes/ovnkube-node-fqshv"
Apr 21 14:55:37.490566 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:55:37.490476 2610 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dpfgl\" (UniqueName: \"kubernetes.io/projected/cff88e04-38fb-4737-b9d3-2f25f36cf06c-kube-api-access-dpfgl\") pod \"ovnkube-node-fqshv\" (UID: \"cff88e04-38fb-4737-b9d3-2f25f36cf06c\") " pod="openshift-ovn-kubernetes/ovnkube-node-fqshv"
Apr 21 14:55:37.490566 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:55:37.490500 2610 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/efd7538a-c71f-4a2a-99a4-44675f8eab54-kubelet-dir\") pod \"aws-ebs-csi-driver-node-fzdmw\" (UID: \"efd7538a-c71f-4a2a-99a4-44675f8eab54\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-fzdmw"
Apr 21 14:55:37.490566 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:55:37.490519 2610 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/cff88e04-38fb-4737-b9d3-2f25f36cf06c-ovn-node-metrics-cert\") pod \"ovnkube-node-fqshv\" (UID: \"cff88e04-38fb-4737-b9d3-2f25f36cf06c\") " pod="openshift-ovn-kubernetes/ovnkube-node-fqshv"
Apr 21 14:55:37.491332 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:55:37.490542 2610 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/b48f3832-4ecd-46ba-bde8-35a4180bf3ca-hosts-file\") pod \"node-resolver-n8qc9\" (UID: \"b48f3832-4ecd-46ba-bde8-35a4180bf3ca\") " pod="openshift-dns/node-resolver-n8qc9"
Apr 21 14:55:37.491332 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:55:37.490562 2610 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-msfzc\" (UniqueName: \"kubernetes.io/projected/9b064625-50f7-4c6a-be44-9aed34a00b26-kube-api-access-msfzc\") pod \"network-metrics-daemon-mtdkf\" (UID: \"9b064625-50f7-4c6a-be44-9aed34a00b26\") " pod="openshift-multus/network-metrics-daemon-mtdkf"
Apr 21 14:55:37.491332 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:55:37.490599 2610 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/cff88e04-38fb-4737-b9d3-2f25f36cf06c-run-systemd\") pod \"ovnkube-node-fqshv\" (UID: \"cff88e04-38fb-4737-b9d3-2f25f36cf06c\") " pod="openshift-ovn-kubernetes/ovnkube-node-fqshv"
Apr 21 14:55:37.491332 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:55:37.490663 2610 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/cff88e04-38fb-4737-b9d3-2f25f36cf06c-var-lib-openvswitch\") pod \"ovnkube-node-fqshv\" (UID: \"cff88e04-38fb-4737-b9d3-2f25f36cf06c\") " pod="openshift-ovn-kubernetes/ovnkube-node-fqshv"
Apr 21 14:55:37.491332 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:55:37.490688 2610 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/cff88e04-38fb-4737-b9d3-2f25f36cf06c-log-socket\") pod \"ovnkube-node-fqshv\" (UID: \"cff88e04-38fb-4737-b9d3-2f25f36cf06c\") " pod="openshift-ovn-kubernetes/ovnkube-node-fqshv"
Apr 21 14:55:37.491332 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:55:37.490841 2610 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-whzlb"
Apr 21 14:55:37.493216 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:55:37.493198 2610 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"node-resolver-dockercfg-shmt9\""
Apr 21 14:55:37.493304 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:55:37.493261 2610 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-jr2bx"
Apr 21 14:55:37.495891 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:55:37.495872 2610 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-z6hwp"
Apr 21 14:55:37.499363 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:55:37.499121 2610 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"kube-root-ca.crt\""
Apr 21 14:55:37.499363 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:55:37.499151 2610 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"whereabouts-flatfile-config\""
Apr 21 14:55:37.499363 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:55:37.499197 2610 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"default-cni-sysctl-allowlist\""
Apr 21 14:55:37.499363 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:55:37.499241 2610 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"multus-daemon-config\""
Apr 21 14:55:37.499363 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:55:37.499275 2610 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-gphbk"
Apr 21 14:55:37.499363 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:55:37.499242 2610 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"cni-copy-resources\""
Apr 21 14:55:37.499363 ip-10-0-129-133 kubenswrapper[2610]: E0421 14:55:37.499341 2610 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-gphbk" podUID="3a01b39a-d3cb-4c13-ba47-ae57c73ab5c1"
Apr 21 14:55:37.499650 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:55:37.499560 2610 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"openshift-service-ca.crt\""
Apr 21 14:55:37.499650 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:55:37.499633 2610 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"image-registry-certificates\""
Apr 21 14:55:37.499745 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:55:37.499663 2610 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"multus-ancillary-tools-dockercfg-rn6jn\""
Apr 21 14:55:37.499745 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:55:37.499665 2610 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"kube-root-ca.crt\""
Apr 21 14:55:37.499745 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:55:37.499712 2610 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"openshift-service-ca.crt\""
Apr 21 14:55:37.499864 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:55:37.499717 2610 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"tuned-dockercfg-584db\""
Apr 21 14:55:37.499864 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:55:37.499809 2610 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"openshift-service-ca.crt\""
Apr 21 14:55:37.499960 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:55:37.499945 2610 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"default-dockercfg-79rjb\""
Apr 21 14:55:37.499960 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:55:37.499957 2610 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"node-ca-dockercfg-zsff7\""
Apr 21 14:55:37.505476 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:55:37.505451 2610 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kube-apiserver-client-kubelet" expiration="2028-04-20 14:50:36 +0000 UTC" deadline="2027-12-18 02:00:16.972238088 +0000 UTC"
Apr 21 14:55:37.505476 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:55:37.505473 2610 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kube-apiserver-client-kubelet" sleep="14531h4m39.466767482s"
Apr 21 14:55:37.506335 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:55:37.506319 2610 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"kube-root-ca.crt\""
Apr 21 14:55:37.516594 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:55:37.516563 2610 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-133.ec2.internal"]
Apr 21 14:55:37.516970 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:55:37.516954 2610 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]"
Apr 21 14:55:37.517049 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:55:37.517017 2610 kubelet.go:3340] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-129-133.ec2.internal"
Apr 21 14:55:37.519290 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:55:37.519272 2610 reflector.go:430] "Caches populated" logger="kubernetes.io/kubelet-serving" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
Apr 21 14:55:37.528599 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:55:37.528584 2610 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]"
Apr 21 14:55:37.528768 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:55:37.528754 2610 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/kube-apiserver-proxy-ip-10-0-129-133.ec2.internal"]
Apr 21 14:55:37.550831 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:55:37.550812 2610 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kubelet-serving" csr="csr-prrkr"
Apr 21 14:55:37.561651 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:55:37.561630 2610 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kubelet-serving" csr="csr-prrkr"
Apr 21 14:55:37.584273 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:55:37.584214 2610 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world"
Apr 21 14:55:37.590851 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:55:37.590831 2610 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/cff88e04-38fb-4737-b9d3-2f25f36cf06c-run-ovn\") pod \"ovnkube-node-fqshv\" (UID: \"cff88e04-38fb-4737-b9d3-2f25f36cf06c\") " pod="openshift-ovn-kubernetes/ovnkube-node-fqshv"
Apr 21 14:55:37.590957 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:55:37.590857 2610 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/cff88e04-38fb-4737-b9d3-2f25f36cf06c-node-log\") pod \"ovnkube-node-fqshv\" (UID: \"cff88e04-38fb-4737-b9d3-2f25f36cf06c\") " pod="openshift-ovn-kubernetes/ovnkube-node-fqshv"
Apr 21 14:55:37.590957 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:55:37.590873 2610 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/9c5fa724-5c79-4789-8467-fe3456892c7d-host-slash\") pod \"iptables-alerter-xpgf5\" (UID: \"9c5fa724-5c79-4789-8467-fe3456892c7d\") " pod="openshift-network-operator/iptables-alerter-xpgf5"
Apr 21 14:55:37.590957 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:55:37.590900 2610 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/cff88e04-38fb-4737-b9d3-2f25f36cf06c-host-slash\") pod \"ovnkube-node-fqshv\" (UID: \"cff88e04-38fb-4737-b9d3-2f25f36cf06c\") " pod="openshift-ovn-kubernetes/ovnkube-node-fqshv"
Apr 21 14:55:37.590957 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:55:37.590917 2610 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/cff88e04-38fb-4737-b9d3-2f25f36cf06c-host-run-ovn-kubernetes\") pod \"ovnkube-node-fqshv\" (UID: \"cff88e04-38fb-4737-b9d3-2f25f36cf06c\") " pod="openshift-ovn-kubernetes/ovnkube-node-fqshv"
Apr 21 14:55:37.590957 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:55:37.590946 2610 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/cff88e04-38fb-4737-b9d3-2f25f36cf06c-run-ovn\") pod \"ovnkube-node-fqshv\" (UID: \"cff88e04-38fb-4737-b9d3-2f25f36cf06c\") " pod="openshift-ovn-kubernetes/ovnkube-node-fqshv"
Apr 21 14:55:37.591187 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:55:37.590960 2610 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/cff88e04-38fb-4737-b9d3-2f25f36cf06c-host-run-ovn-kubernetes\") pod \"ovnkube-node-fqshv\" (UID: \"cff88e04-38fb-4737-b9d3-2f25f36cf06c\") " pod="openshift-ovn-kubernetes/ovnkube-node-fqshv"
Apr 21 14:55:37.591187 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:55:37.590967 2610 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/cff88e04-38fb-4737-b9d3-2f25f36cf06c-host-slash\") pod \"ovnkube-node-fqshv\" (UID: \"cff88e04-38fb-4737-b9d3-2f25f36cf06c\") " pod="openshift-ovn-kubernetes/ovnkube-node-fqshv"
Apr 21 14:55:37.591187 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:55:37.590953 2610 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/cff88e04-38fb-4737-b9d3-2f25f36cf06c-node-log\") pod \"ovnkube-node-fqshv\" (UID: \"cff88e04-38fb-4737-b9d3-2f25f36cf06c\") " pod="openshift-ovn-kubernetes/ovnkube-node-fqshv"
Apr 21 14:55:37.591187 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:55:37.590959 2610 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/9c5fa724-5c79-4789-8467-fe3456892c7d-host-slash\") pod \"iptables-alerter-xpgf5\" (UID: \"9c5fa724-5c79-4789-8467-fe3456892c7d\") " pod="openshift-network-operator/iptables-alerter-xpgf5"
Apr 21 14:55:37.591187 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:55:37.590983 2610 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/cff88e04-38fb-4737-b9d3-2f25f36cf06c-ovnkube-config\") pod \"ovnkube-node-fqshv\" (UID: \"cff88e04-38fb-4737-b9d3-2f25f36cf06c\") " pod="openshift-ovn-kubernetes/ovnkube-node-fqshv"
Apr 21 14:55:37.591187 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:55:37.591031 2610 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/b48f3832-4ecd-46ba-bde8-35a4180bf3ca-tmp-dir\") pod \"node-resolver-n8qc9\" (UID: \"b48f3832-4ecd-46ba-bde8-35a4180bf3ca\") " pod="openshift-dns/node-resolver-n8qc9"
Apr 21 14:55:37.591187 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:55:37.591057 2610 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/002f040e-530f-43cc-92d7-0789dd3ec88e-agent-certs\") pod \"konnectivity-agent-qgsvb\" (UID: \"002f040e-530f-43cc-92d7-0789dd3ec88e\") " pod="kube-system/konnectivity-agent-qgsvb"
Apr 21 14:55:37.591187 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:55:37.591079 2610 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/1ecf557b-547c-4875-bb8d-a80ee4cd1f74-system-cni-dir\") pod \"multus-whzlb\" (UID: \"1ecf557b-547c-4875-bb8d-a80ee4cd1f74\") " pod="openshift-multus/multus-whzlb"
Apr 21 14:55:37.591187 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:55:37.591094 2610 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/1ecf557b-547c-4875-bb8d-a80ee4cd1f74-os-release\") pod \"multus-whzlb\" (UID: \"1ecf557b-547c-4875-bb8d-a80ee4cd1f74\") " pod="openshift-multus/multus-whzlb"
Apr 21 14:55:37.591187 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:55:37.591131 2610 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/cff88e04-38fb-4737-b9d3-2f25f36cf06c-host-kubelet\") pod \"ovnkube-node-fqshv\" (UID: \"cff88e04-38fb-4737-b9d3-2f25f36cf06c\") " pod="openshift-ovn-kubernetes/ovnkube-node-fqshv"
Apr 21 14:55:37.591609 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:55:37.591227 2610 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/cff88e04-38fb-4737-b9d3-2f25f36cf06c-host-kubelet\") pod \"ovnkube-node-fqshv\" (UID: \"cff88e04-38fb-4737-b9d3-2f25f36cf06c\") " pod="openshift-ovn-kubernetes/ovnkube-node-fqshv"
Apr 21 14:55:37.591609 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:55:37.591234 2610 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/1ecf557b-547c-4875-bb8d-a80ee4cd1f74-multus-socket-dir-parent\") pod \"multus-whzlb\" (UID: \"1ecf557b-547c-4875-bb8d-a80ee4cd1f74\") " pod="openshift-multus/multus-whzlb"
Apr 21 14:55:37.591609 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:55:37.591262 2610 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/1ecf557b-547c-4875-bb8d-a80ee4cd1f74-multus-daemon-config\") pod \"multus-whzlb\" (UID: \"1ecf557b-547c-4875-bb8d-a80ee4cd1f74\") " pod="openshift-multus/multus-whzlb"
Apr 21 14:55:37.591609 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:55:37.591278 2610 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/1ecf557b-547c-4875-bb8d-a80ee4cd1f74-host-run-multus-certs\") pod \"multus-whzlb\" (UID: \"1ecf557b-547c-4875-bb8d-a80ee4cd1f74\") " pod="openshift-multus/multus-whzlb"
Apr 21 14:55:37.591609 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:55:37.591293 2610 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/bb8d83e4-e43f-4a6f-8423-a2feab619d63-host\") pod \"tuned-jr2bx\" (UID: \"bb8d83e4-e43f-4a6f-8423-a2feab619d63\") " pod="openshift-cluster-node-tuning-operator/tuned-jr2bx"
Apr 21 14:55:37.591609 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:55:37.591326 2610 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-dpfgl\" (UniqueName: \"kubernetes.io/projected/cff88e04-38fb-4737-b9d3-2f25f36cf06c-kube-api-access-dpfgl\") pod \"ovnkube-node-fqshv\" (UID: \"cff88e04-38fb-4737-b9d3-2f25f36cf06c\") " pod="openshift-ovn-kubernetes/ovnkube-node-fqshv"
Apr 21 14:55:37.591609 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:55:37.591356 2610 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/b48f3832-4ecd-46ba-bde8-35a4180bf3ca-tmp-dir\") pod \"node-resolver-n8qc9\" (UID: \"b48f3832-4ecd-46ba-bde8-35a4180bf3ca\") " pod="openshift-dns/node-resolver-n8qc9"
Apr 21 14:55:37.591900 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:55:37.591407 2610 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory"
Apr 21 14:55:37.591900 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:55:37.591416 2610 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/efd7538a-c71f-4a2a-99a4-44675f8eab54-kubelet-dir\") pod \"aws-ebs-csi-driver-node-fzdmw\" (UID: \"efd7538a-c71f-4a2a-99a4-44675f8eab54\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-fzdmw"
Apr 21 14:55:37.591900 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:55:37.591884 2610 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/efd7538a-c71f-4a2a-99a4-44675f8eab54-kubelet-dir\") pod \"aws-ebs-csi-driver-node-fzdmw\" (UID: \"efd7538a-c71f-4a2a-99a4-44675f8eab54\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-fzdmw"
Apr 21 14:55:37.592039 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:55:37.591933 2610 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/72237d81-3f9e-4b04-a299-0acb0dd6604c-host\") pod \"node-ca-z6hwp\" (UID: \"72237d81-3f9e-4b04-a299-0acb0dd6604c\") " pod="openshift-image-registry/node-ca-z6hwp"
Apr 21 14:55:37.592039 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:55:37.591942 2610 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/cff88e04-38fb-4737-b9d3-2f25f36cf06c-ovnkube-config\") pod \"ovnkube-node-fqshv\" (UID: \"cff88e04-38fb-4737-b9d3-2f25f36cf06c\") " pod="openshift-ovn-kubernetes/ovnkube-node-fqshv"
Apr 21 14:55:37.592039 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:55:37.592013 2610 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/bb8d83e4-e43f-4a6f-8423-a2feab619d63-etc-tuned\") pod \"tuned-jr2bx\" (UID: \"bb8d83e4-e43f-4a6f-8423-a2feab619d63\") " pod="openshift-cluster-node-tuning-operator/tuned-jr2bx"
Apr 21 14:55:37.592195 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:55:37.592076 2610 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/cff88e04-38fb-4737-b9d3-2f25f36cf06c-ovn-node-metrics-cert\") pod \"ovnkube-node-fqshv\" (UID: \"cff88e04-38fb-4737-b9d3-2f25f36cf06c\") " pod="openshift-ovn-kubernetes/ovnkube-node-fqshv"
Apr 21 14:55:37.592195 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:55:37.592146 2610 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/b48f3832-4ecd-46ba-bde8-35a4180bf3ca-hosts-file\") pod \"node-resolver-n8qc9\" (UID: \"b48f3832-4ecd-46ba-bde8-35a4180bf3ca\") " pod="openshift-dns/node-resolver-n8qc9"
Apr 21 14:55:37.592294 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:55:37.592220 2610 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-msfzc\" (UniqueName: \"kubernetes.io/projected/9b064625-50f7-4c6a-be44-9aed34a00b26-kube-api-access-msfzc\") pod \"network-metrics-daemon-mtdkf\" (UID: \"9b064625-50f7-4c6a-be44-9aed34a00b26\") " pod="openshift-multus/network-metrics-daemon-mtdkf"
Apr 21 14:55:37.592294 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:55:37.592229 2610 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/b48f3832-4ecd-46ba-bde8-35a4180bf3ca-hosts-file\") pod \"node-resolver-n8qc9\" (UID: \"b48f3832-4ecd-46ba-bde8-35a4180bf3ca\") " pod="openshift-dns/node-resolver-n8qc9"
Apr 21 14:55:37.592294 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:55:37.592267 2610 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/cf009889-4a60-4449-8425-a8c15708e69e-system-cni-dir\") pod \"multus-additional-cni-plugins-kjdc5\" (UID: \"cf009889-4a60-4449-8425-a8c15708e69e\") " pod="openshift-multus/multus-additional-cni-plugins-kjdc5"
Apr 21 14:55:37.592434 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:55:37.592303 2610 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zc5cf\" (UniqueName: \"kubernetes.io/projected/1ecf557b-547c-4875-bb8d-a80ee4cd1f74-kube-api-access-zc5cf\") pod \"multus-whzlb\" (UID: \"1ecf557b-547c-4875-bb8d-a80ee4cd1f74\") " pod="openshift-multus/multus-whzlb"
Apr 21 14:55:37.592434 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:55:37.592344 2610 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/bb8d83e4-e43f-4a6f-8423-a2feab619d63-etc-modprobe-d\") pod \"tuned-jr2bx\" (UID: \"bb8d83e4-e43f-4a6f-8423-a2feab619d63\") " pod="openshift-cluster-node-tuning-operator/tuned-jr2bx"
Apr 21 14:55:37.592434 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:55:37.592379 2610 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/cff88e04-38fb-4737-b9d3-2f25f36cf06c-run-systemd\") pod \"ovnkube-node-fqshv\" (UID: \"cff88e04-38fb-4737-b9d3-2f25f36cf06c\") " pod="openshift-ovn-kubernetes/ovnkube-node-fqshv"
Apr 21 14:55:37.592434 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:55:37.592408 2610 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/cff88e04-38fb-4737-b9d3-2f25f36cf06c-log-socket\") pod \"ovnkube-node-fqshv\" (UID: \"cff88e04-38fb-4737-b9d3-2f25f36cf06c\") " pod="openshift-ovn-kubernetes/ovnkube-node-fqshv"
Apr 21 14:55:37.592648 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:55:37.592455 2610 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/1ecf557b-547c-4875-bb8d-a80ee4cd1f74-etc-kubernetes\") pod \"multus-whzlb\" (UID: \"1ecf557b-547c-4875-bb8d-a80ee4cd1f74\") " pod="openshift-multus/multus-whzlb"
Apr 21 14:55:37.592648 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:55:37.592505 2610 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/bb8d83e4-e43f-4a6f-8423-a2feab619d63-var-lib-kubelet\") pod \"tuned-jr2bx\" (UID: \"bb8d83e4-e43f-4a6f-8423-a2feab619d63\") " pod="openshift-cluster-node-tuning-operator/tuned-jr2bx"
Apr 21 14:55:37.592648 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:55:37.592508 2610 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/cff88e04-38fb-4737-b9d3-2f25f36cf06c-run-systemd\") pod \"ovnkube-node-fqshv\" (UID: \"cff88e04-38fb-4737-b9d3-2f25f36cf06c\") " pod="openshift-ovn-kubernetes/ovnkube-node-fqshv"
Apr 21 14:55:37.592790 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:55:37.592661 2610 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/cff88e04-38fb-4737-b9d3-2f25f36cf06c-log-socket\") pod \"ovnkube-node-fqshv\" (UID: \"cff88e04-38fb-4737-b9d3-2f25f36cf06c\") " pod="openshift-ovn-kubernetes/ovnkube-node-fqshv"
Apr 21 14:55:37.592836 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:55:37.592808 2610 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/cff88e04-38fb-4737-b9d3-2f25f36cf06c-host-cni-bin\") pod \"ovnkube-node-fqshv\" (UID: \"cff88e04-38fb-4737-b9d3-2f25f36cf06c\") " pod="openshift-ovn-kubernetes/ovnkube-node-fqshv"
Apr 21 14:55:37.593659 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:55:37.592551 2610 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/cff88e04-38fb-4737-b9d3-2f25f36cf06c-host-cni-bin\") pod \"ovnkube-node-fqshv\" (UID: \"cff88e04-38fb-4737-b9d3-2f25f36cf06c\") " pod="openshift-ovn-kubernetes/ovnkube-node-fqshv"
Apr 21 14:55:37.593659 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:55:37.593076 2610 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/cff88e04-38fb-4737-b9d3-2f25f36cf06c-host-cni-netd\") pod \"ovnkube-node-fqshv\" (UID: \"cff88e04-38fb-4737-b9d3-2f25f36cf06c\") " pod="openshift-ovn-kubernetes/ovnkube-node-fqshv"
Apr 21 14:55:37.593659 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:55:37.593110 2610 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"device-dir\" (UniqueName:
\"kubernetes.io/host-path/efd7538a-c71f-4a2a-99a4-44675f8eab54-device-dir\") pod \"aws-ebs-csi-driver-node-fzdmw\" (UID: \"efd7538a-c71f-4a2a-99a4-44675f8eab54\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-fzdmw" Apr 21 14:55:37.593659 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:55:37.593143 2610 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rs5cd\" (UniqueName: \"kubernetes.io/projected/cf009889-4a60-4449-8425-a8c15708e69e-kube-api-access-rs5cd\") pod \"multus-additional-cni-plugins-kjdc5\" (UID: \"cf009889-4a60-4449-8425-a8c15708e69e\") " pod="openshift-multus/multus-additional-cni-plugins-kjdc5" Apr 21 14:55:37.593659 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:55:37.593168 2610 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/1ecf557b-547c-4875-bb8d-a80ee4cd1f74-cni-binary-copy\") pod \"multus-whzlb\" (UID: \"1ecf557b-547c-4875-bb8d-a80ee4cd1f74\") " pod="openshift-multus/multus-whzlb" Apr 21 14:55:37.593659 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:55:37.593199 2610 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/1ecf557b-547c-4875-bb8d-a80ee4cd1f74-host-var-lib-kubelet\") pod \"multus-whzlb\" (UID: \"1ecf557b-547c-4875-bb8d-a80ee4cd1f74\") " pod="openshift-multus/multus-whzlb" Apr 21 14:55:37.593659 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:55:37.593228 2610 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/bb8d83e4-e43f-4a6f-8423-a2feab619d63-etc-sysconfig\") pod \"tuned-jr2bx\" (UID: \"bb8d83e4-e43f-4a6f-8423-a2feab619d63\") " pod="openshift-cluster-node-tuning-operator/tuned-jr2bx" Apr 21 14:55:37.593659 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:55:37.593255 2610 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/bb8d83e4-e43f-4a6f-8423-a2feab619d63-etc-kubernetes\") pod \"tuned-jr2bx\" (UID: \"bb8d83e4-e43f-4a6f-8423-a2feab619d63\") " pod="openshift-cluster-node-tuning-operator/tuned-jr2bx" Apr 21 14:55:37.593659 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:55:37.593283 2610 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/bb8d83e4-e43f-4a6f-8423-a2feab619d63-sys\") pod \"tuned-jr2bx\" (UID: \"bb8d83e4-e43f-4a6f-8423-a2feab619d63\") " pod="openshift-cluster-node-tuning-operator/tuned-jr2bx" Apr 21 14:55:37.593659 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:55:37.593335 2610 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/cff88e04-38fb-4737-b9d3-2f25f36cf06c-systemd-units\") pod \"ovnkube-node-fqshv\" (UID: \"cff88e04-38fb-4737-b9d3-2f25f36cf06c\") " pod="openshift-ovn-kubernetes/ovnkube-node-fqshv" Apr 21 14:55:37.593659 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:55:37.593367 2610 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/cff88e04-38fb-4737-b9d3-2f25f36cf06c-run-openvswitch\") pod \"ovnkube-node-fqshv\" (UID: \"cff88e04-38fb-4737-b9d3-2f25f36cf06c\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-fqshv" Apr 21 14:55:37.593659 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:55:37.593397 2610 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/efd7538a-c71f-4a2a-99a4-44675f8eab54-etc-selinux\") pod \"aws-ebs-csi-driver-node-fzdmw\" (UID: \"efd7538a-c71f-4a2a-99a4-44675f8eab54\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-fzdmw" Apr 21 14:55:37.593659 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:55:37.593428 2610 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-fkvqs\" (UniqueName: \"kubernetes.io/projected/b48f3832-4ecd-46ba-bde8-35a4180bf3ca-kube-api-access-fkvqs\") pod \"node-resolver-n8qc9\" (UID: \"b48f3832-4ecd-46ba-bde8-35a4180bf3ca\") " pod="openshift-dns/node-resolver-n8qc9" Apr 21 14:55:37.593659 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:55:37.593460 2610 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/cf009889-4a60-4449-8425-a8c15708e69e-os-release\") pod \"multus-additional-cni-plugins-kjdc5\" (UID: \"cf009889-4a60-4449-8425-a8c15708e69e\") " pod="openshift-multus/multus-additional-cni-plugins-kjdc5" Apr 21 14:55:37.593659 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:55:37.593495 2610 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/bb8d83e4-e43f-4a6f-8423-a2feab619d63-lib-modules\") pod \"tuned-jr2bx\" (UID: \"bb8d83e4-e43f-4a6f-8423-a2feab619d63\") " pod="openshift-cluster-node-tuning-operator/tuned-jr2bx" Apr 21 14:55:37.593659 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:55:37.593518 2610 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/bb8d83e4-e43f-4a6f-8423-a2feab619d63-tmp\") pod \"tuned-jr2bx\" (UID: \"bb8d83e4-e43f-4a6f-8423-a2feab619d63\") " pod="openshift-cluster-node-tuning-operator/tuned-jr2bx" Apr 21 14:55:37.593659 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:55:37.593590 2610 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/cff88e04-38fb-4737-b9d3-2f25f36cf06c-etc-openvswitch\") pod \"ovnkube-node-fqshv\" (UID: \"cff88e04-38fb-4737-b9d3-2f25f36cf06c\") " pod="openshift-ovn-kubernetes/ovnkube-node-fqshv" Apr 21 14:55:37.594521 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:55:37.593623 2610 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/efd7538a-c71f-4a2a-99a4-44675f8eab54-registration-dir\") pod \"aws-ebs-csi-driver-node-fzdmw\" (UID: \"efd7538a-c71f-4a2a-99a4-44675f8eab54\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-fzdmw" Apr 21 14:55:37.594521 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:55:37.593656 2610 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/cf009889-4a60-4449-8425-a8c15708e69e-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-kjdc5\" (UID: \"cf009889-4a60-4449-8425-a8c15708e69e\") " pod="openshift-multus/multus-additional-cni-plugins-kjdc5" Apr 21 14:55:37.594521 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:55:37.593689 2610 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/cff88e04-38fb-4737-b9d3-2f25f36cf06c-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-fqshv\" (UID: \"cff88e04-38fb-4737-b9d3-2f25f36cf06c\") " pod="openshift-ovn-kubernetes/ovnkube-node-fqshv" Apr 21 14:55:37.594521 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:55:37.593714 2610 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/efd7538a-c71f-4a2a-99a4-44675f8eab54-socket-dir\") pod \"aws-ebs-csi-driver-node-fzdmw\" (UID: \"efd7538a-c71f-4a2a-99a4-44675f8eab54\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-fzdmw" Apr 21 14:55:37.594521 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:55:37.593744 2610 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-l8v4w\" (UniqueName: \"kubernetes.io/projected/efd7538a-c71f-4a2a-99a4-44675f8eab54-kube-api-access-l8v4w\") pod \"aws-ebs-csi-driver-node-fzdmw\" (UID: \"efd7538a-c71f-4a2a-99a4-44675f8eab54\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-fzdmw" Apr 21 14:55:37.594521 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:55:37.593783 2610 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/1ecf557b-547c-4875-bb8d-a80ee4cd1f74-host-run-netns\") pod \"multus-whzlb\" (UID: \"1ecf557b-547c-4875-bb8d-a80ee4cd1f74\") " pod="openshift-multus/multus-whzlb" Apr 21 14:55:37.594521 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:55:37.593814 2610 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ct74t\" (UniqueName: \"kubernetes.io/projected/9c5fa724-5c79-4789-8467-fe3456892c7d-kube-api-access-ct74t\") pod \"iptables-alerter-xpgf5\" (UID: \"9c5fa724-5c79-4789-8467-fe3456892c7d\") " pod="openshift-network-operator/iptables-alerter-xpgf5" Apr 21 14:55:37.594521 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:55:37.593843 2610 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/cff88e04-38fb-4737-b9d3-2f25f36cf06c-host-run-netns\") pod \"ovnkube-node-fqshv\" (UID: \"cff88e04-38fb-4737-b9d3-2f25f36cf06c\") " pod="openshift-ovn-kubernetes/ovnkube-node-fqshv" Apr 21 14:55:37.594521 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:55:37.593868 2610 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/cff88e04-38fb-4737-b9d3-2f25f36cf06c-env-overrides\") pod \"ovnkube-node-fqshv\" (UID: \"cff88e04-38fb-4737-b9d3-2f25f36cf06c\") " pod="openshift-ovn-kubernetes/ovnkube-node-fqshv" Apr 21 14:55:37.594521 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:55:37.593900 2610 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/cf009889-4a60-4449-8425-a8c15708e69e-tuning-conf-dir\") pod \"multus-additional-cni-plugins-kjdc5\" (UID: \"cf009889-4a60-4449-8425-a8c15708e69e\") " pod="openshift-multus/multus-additional-cni-plugins-kjdc5" Apr 21 14:55:37.594521 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:55:37.593931 2610 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: 
\"kubernetes.io/configmap/cf009889-4a60-4449-8425-a8c15708e69e-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-kjdc5\" (UID: \"cf009889-4a60-4449-8425-a8c15708e69e\") " pod="openshift-multus/multus-additional-cni-plugins-kjdc5" Apr 21 14:55:37.594521 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:55:37.593965 2610 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/1ecf557b-547c-4875-bb8d-a80ee4cd1f74-host-var-lib-cni-multus\") pod \"multus-whzlb\" (UID: \"1ecf557b-547c-4875-bb8d-a80ee4cd1f74\") " pod="openshift-multus/multus-whzlb" Apr 21 14:55:37.594521 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:55:37.593998 2610 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/cff88e04-38fb-4737-b9d3-2f25f36cf06c-var-lib-openvswitch\") pod \"ovnkube-node-fqshv\" (UID: \"cff88e04-38fb-4737-b9d3-2f25f36cf06c\") " pod="openshift-ovn-kubernetes/ovnkube-node-fqshv" Apr 21 14:55:37.594521 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:55:37.594027 2610 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/cff88e04-38fb-4737-b9d3-2f25f36cf06c-ovnkube-script-lib\") pod \"ovnkube-node-fqshv\" (UID: \"cff88e04-38fb-4737-b9d3-2f25f36cf06c\") " pod="openshift-ovn-kubernetes/ovnkube-node-fqshv" Apr 21 14:55:37.594521 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:55:37.594064 2610 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/cf009889-4a60-4449-8425-a8c15708e69e-cnibin\") pod \"multus-additional-cni-plugins-kjdc5\" (UID: \"cf009889-4a60-4449-8425-a8c15708e69e\") " pod="openshift-multus/multus-additional-cni-plugins-kjdc5" Apr 21 14:55:37.594521 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:55:37.594090 2610 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/cf009889-4a60-4449-8425-a8c15708e69e-cni-binary-copy\") pod \"multus-additional-cni-plugins-kjdc5\" (UID: \"cf009889-4a60-4449-8425-a8c15708e69e\") " pod="openshift-multus/multus-additional-cni-plugins-kjdc5" Apr 21 14:55:37.595366 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:55:37.594120 2610 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/1ecf557b-547c-4875-bb8d-a80ee4cd1f74-multus-cni-dir\") pod \"multus-whzlb\" (UID: \"1ecf557b-547c-4875-bb8d-a80ee4cd1f74\") " pod="openshift-multus/multus-whzlb" Apr 21 14:55:37.595366 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:55:37.594148 2610 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/1ecf557b-547c-4875-bb8d-a80ee4cd1f74-host-run-k8s-cni-cncf-io\") pod \"multus-whzlb\" (UID: \"1ecf557b-547c-4875-bb8d-a80ee4cd1f74\") " pod="openshift-multus/multus-whzlb" Apr 21 14:55:37.595366 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:55:37.594178 2610 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/1ecf557b-547c-4875-bb8d-a80ee4cd1f74-hostroot\") pod \"multus-whzlb\" (UID: \"1ecf557b-547c-4875-bb8d-a80ee4cd1f74\") " 
pod="openshift-multus/multus-whzlb" Apr 21 14:55:37.595366 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:55:37.594210 2610 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/efd7538a-c71f-4a2a-99a4-44675f8eab54-sys-fs\") pod \"aws-ebs-csi-driver-node-fzdmw\" (UID: \"efd7538a-c71f-4a2a-99a4-44675f8eab54\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-fzdmw" Apr 21 14:55:37.595366 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:55:37.594236 2610 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/002f040e-530f-43cc-92d7-0789dd3ec88e-konnectivity-ca\") pod \"konnectivity-agent-qgsvb\" (UID: \"002f040e-530f-43cc-92d7-0789dd3ec88e\") " pod="kube-system/konnectivity-agent-qgsvb" Apr 21 14:55:37.595366 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:55:37.594268 2610 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/72237d81-3f9e-4b04-a299-0acb0dd6604c-serviceca\") pod \"node-ca-z6hwp\" (UID: \"72237d81-3f9e-4b04-a299-0acb0dd6604c\") " pod="openshift-image-registry/node-ca-z6hwp" Apr 21 14:55:37.595366 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:55:37.594297 2610 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/1ecf557b-547c-4875-bb8d-a80ee4cd1f74-host-var-lib-cni-bin\") pod \"multus-whzlb\" (UID: \"1ecf557b-547c-4875-bb8d-a80ee4cd1f74\") " pod="openshift-multus/multus-whzlb" Apr 21 14:55:37.595366 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:55:37.594327 2610 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/1ecf557b-547c-4875-bb8d-a80ee4cd1f74-multus-conf-dir\") pod \"multus-whzlb\" (UID: \"1ecf557b-547c-4875-bb8d-a80ee4cd1f74\") " pod="openshift-multus/multus-whzlb" Apr 21 14:55:37.595366 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:55:37.594354 2610 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/bb8d83e4-e43f-4a6f-8423-a2feab619d63-etc-sysctl-d\") pod \"tuned-jr2bx\" (UID: \"bb8d83e4-e43f-4a6f-8423-a2feab619d63\") " pod="openshift-cluster-node-tuning-operator/tuned-jr2bx" Apr 21 14:55:37.595366 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:55:37.594377 2610 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/bb8d83e4-e43f-4a6f-8423-a2feab619d63-etc-sysctl-conf\") pod \"tuned-jr2bx\" (UID: \"bb8d83e4-e43f-4a6f-8423-a2feab619d63\") " pod="openshift-cluster-node-tuning-operator/tuned-jr2bx" Apr 21 14:55:37.595366 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:55:37.594430 2610 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/bb8d83e4-e43f-4a6f-8423-a2feab619d63-run\") pod \"tuned-jr2bx\" (UID: \"bb8d83e4-e43f-4a6f-8423-a2feab619d63\") " pod="openshift-cluster-node-tuning-operator/tuned-jr2bx" Apr 21 14:55:37.595366 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:55:37.594444 2610 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: 
\"kubernetes.io/host-path/efd7538a-c71f-4a2a-99a4-44675f8eab54-socket-dir\") pod \"aws-ebs-csi-driver-node-fzdmw\" (UID: \"efd7538a-c71f-4a2a-99a4-44675f8eab54\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-fzdmw" Apr 21 14:55:37.595366 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:55:37.594462 2610 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/9c5fa724-5c79-4789-8467-fe3456892c7d-iptables-alerter-script\") pod \"iptables-alerter-xpgf5\" (UID: \"9c5fa724-5c79-4789-8467-fe3456892c7d\") " pod="openshift-network-operator/iptables-alerter-xpgf5" Apr 21 14:55:37.595366 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:55:37.594498 2610 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/9b064625-50f7-4c6a-be44-9aed34a00b26-metrics-certs\") pod \"network-metrics-daemon-mtdkf\" (UID: \"9b064625-50f7-4c6a-be44-9aed34a00b26\") " pod="openshift-multus/network-metrics-daemon-mtdkf" Apr 21 14:55:37.595366 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:55:37.594529 2610 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/efd7538a-c71f-4a2a-99a4-44675f8eab54-device-dir\") pod \"aws-ebs-csi-driver-node-fzdmw\" (UID: \"efd7538a-c71f-4a2a-99a4-44675f8eab54\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-fzdmw" Apr 21 14:55:37.595366 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:55:37.594755 2610 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/cff88e04-38fb-4737-b9d3-2f25f36cf06c-host-cni-netd\") pod \"ovnkube-node-fqshv\" (UID: \"cff88e04-38fb-4737-b9d3-2f25f36cf06c\") " pod="openshift-ovn-kubernetes/ovnkube-node-fqshv" Apr 21 14:55:37.595366 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:55:37.594530 2610 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gntrw\" (UniqueName: \"kubernetes.io/projected/72237d81-3f9e-4b04-a299-0acb0dd6604c-kube-api-access-gntrw\") pod \"node-ca-z6hwp\" (UID: \"72237d81-3f9e-4b04-a299-0acb0dd6604c\") " pod="openshift-image-registry/node-ca-z6hwp" Apr 21 14:55:37.596231 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:55:37.594920 2610 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/1ecf557b-547c-4875-bb8d-a80ee4cd1f74-cnibin\") pod \"multus-whzlb\" (UID: \"1ecf557b-547c-4875-bb8d-a80ee4cd1f74\") " pod="openshift-multus/multus-whzlb" Apr 21 14:55:37.596231 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:55:37.594961 2610 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/bb8d83e4-e43f-4a6f-8423-a2feab619d63-etc-systemd\") pod \"tuned-jr2bx\" (UID: \"bb8d83e4-e43f-4a6f-8423-a2feab619d63\") " pod="openshift-cluster-node-tuning-operator/tuned-jr2bx" Apr 21 14:55:37.596231 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:55:37.594991 2610 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mvljq\" (UniqueName: \"kubernetes.io/projected/bb8d83e4-e43f-4a6f-8423-a2feab619d63-kube-api-access-mvljq\") pod \"tuned-jr2bx\" (UID: \"bb8d83e4-e43f-4a6f-8423-a2feab619d63\") " pod="openshift-cluster-node-tuning-operator/tuned-jr2bx" Apr 21 
14:55:37.596231 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:55:37.595030 2610 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jmmhw\" (UniqueName: \"kubernetes.io/projected/3a01b39a-d3cb-4c13-ba47-ae57c73ab5c1-kube-api-access-jmmhw\") pod \"network-check-target-gphbk\" (UID: \"3a01b39a-d3cb-4c13-ba47-ae57c73ab5c1\") " pod="openshift-network-diagnostics/network-check-target-gphbk" Apr 21 14:55:37.596231 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:55:37.595136 2610 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/cff88e04-38fb-4737-b9d3-2f25f36cf06c-host-run-netns\") pod \"ovnkube-node-fqshv\" (UID: \"cff88e04-38fb-4737-b9d3-2f25f36cf06c\") " pod="openshift-ovn-kubernetes/ovnkube-node-fqshv" Apr 21 14:55:37.596231 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:55:37.595326 2610 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/cff88e04-38fb-4737-b9d3-2f25f36cf06c-systemd-units\") pod \"ovnkube-node-fqshv\" (UID: \"cff88e04-38fb-4737-b9d3-2f25f36cf06c\") " pod="openshift-ovn-kubernetes/ovnkube-node-fqshv" Apr 21 14:55:37.596231 ip-10-0-129-133 kubenswrapper[2610]: E0421 14:55:37.595426 2610 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 21 14:55:37.596231 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:55:37.595491 2610 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/cff88e04-38fb-4737-b9d3-2f25f36cf06c-env-overrides\") pod \"ovnkube-node-fqshv\" (UID: \"cff88e04-38fb-4737-b9d3-2f25f36cf06c\") " pod="openshift-ovn-kubernetes/ovnkube-node-fqshv" Apr 21 14:55:37.596231 ip-10-0-129-133 kubenswrapper[2610]: E0421 14:55:37.595527 2610 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9b064625-50f7-4c6a-be44-9aed34a00b26-metrics-certs podName:9b064625-50f7-4c6a-be44-9aed34a00b26 nodeName:}" failed. No retries permitted until 2026-04-21 14:55:38.095483729 +0000 UTC m=+2.079375776 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/9b064625-50f7-4c6a-be44-9aed34a00b26-metrics-certs") pod "network-metrics-daemon-mtdkf" (UID: "9b064625-50f7-4c6a-be44-9aed34a00b26") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 21 14:55:37.596231 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:55:37.595564 2610 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/cff88e04-38fb-4737-b9d3-2f25f36cf06c-var-lib-openvswitch\") pod \"ovnkube-node-fqshv\" (UID: \"cff88e04-38fb-4737-b9d3-2f25f36cf06c\") " pod="openshift-ovn-kubernetes/ovnkube-node-fqshv" Apr 21 14:55:37.596231 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:55:37.595646 2610 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/9c5fa724-5c79-4789-8467-fe3456892c7d-iptables-alerter-script\") pod \"iptables-alerter-xpgf5\" (UID: \"9c5fa724-5c79-4789-8467-fe3456892c7d\") " pod="openshift-network-operator/iptables-alerter-xpgf5" Apr 21 14:55:37.596231 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:55:37.595673 2610 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/efd7538a-c71f-4a2a-99a4-44675f8eab54-etc-selinux\") pod \"aws-ebs-csi-driver-node-fzdmw\" (UID: \"efd7538a-c71f-4a2a-99a4-44675f8eab54\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-fzdmw" Apr 21 14:55:37.596231 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:55:37.595699 2610 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/efd7538a-c71f-4a2a-99a4-44675f8eab54-sys-fs\") pod \"aws-ebs-csi-driver-node-fzdmw\" (UID: \"efd7538a-c71f-4a2a-99a4-44675f8eab54\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-fzdmw" Apr 21 14:55:37.596231 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:55:37.595737 2610 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/cff88e04-38fb-4737-b9d3-2f25f36cf06c-run-openvswitch\") pod \"ovnkube-node-fqshv\" (UID: \"cff88e04-38fb-4737-b9d3-2f25f36cf06c\") " pod="openshift-ovn-kubernetes/ovnkube-node-fqshv" Apr 21 14:55:37.596231 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:55:37.596088 2610 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/cff88e04-38fb-4737-b9d3-2f25f36cf06c-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-fqshv\" (UID: \"cff88e04-38fb-4737-b9d3-2f25f36cf06c\") " pod="openshift-ovn-kubernetes/ovnkube-node-fqshv" Apr 21 14:55:37.596231 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:55:37.596154 2610 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/efd7538a-c71f-4a2a-99a4-44675f8eab54-registration-dir\") pod \"aws-ebs-csi-driver-node-fzdmw\" (UID: \"efd7538a-c71f-4a2a-99a4-44675f8eab54\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-fzdmw" Apr 21 14:55:37.596231 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:55:37.596207 2610 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/cff88e04-38fb-4737-b9d3-2f25f36cf06c-etc-openvswitch\") pod \"ovnkube-node-fqshv\" (UID: 
\"cff88e04-38fb-4737-b9d3-2f25f36cf06c\") " pod="openshift-ovn-kubernetes/ovnkube-node-fqshv" Apr 21 14:55:37.597013 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:55:37.596257 2610 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/002f040e-530f-43cc-92d7-0789dd3ec88e-konnectivity-ca\") pod \"konnectivity-agent-qgsvb\" (UID: \"002f040e-530f-43cc-92d7-0789dd3ec88e\") " pod="kube-system/konnectivity-agent-qgsvb" Apr 21 14:55:37.597098 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:55:37.597056 2610 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/cff88e04-38fb-4737-b9d3-2f25f36cf06c-ovnkube-script-lib\") pod \"ovnkube-node-fqshv\" (UID: \"cff88e04-38fb-4737-b9d3-2f25f36cf06c\") " pod="openshift-ovn-kubernetes/ovnkube-node-fqshv" Apr 21 14:55:37.598021 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:55:37.597995 2610 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/cff88e04-38fb-4737-b9d3-2f25f36cf06c-ovn-node-metrics-cert\") pod \"ovnkube-node-fqshv\" (UID: \"cff88e04-38fb-4737-b9d3-2f25f36cf06c\") " pod="openshift-ovn-kubernetes/ovnkube-node-fqshv" Apr 21 14:55:37.598115 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:55:37.598100 2610 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/002f040e-530f-43cc-92d7-0789dd3ec88e-agent-certs\") pod \"konnectivity-agent-qgsvb\" (UID: \"002f040e-530f-43cc-92d7-0789dd3ec88e\") " pod="kube-system/konnectivity-agent-qgsvb" Apr 21 14:55:37.599805 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:55:37.599786 2610 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-dpfgl\" (UniqueName: \"kubernetes.io/projected/cff88e04-38fb-4737-b9d3-2f25f36cf06c-kube-api-access-dpfgl\") pod \"ovnkube-node-fqshv\" (UID: \"cff88e04-38fb-4737-b9d3-2f25f36cf06c\") " pod="openshift-ovn-kubernetes/ovnkube-node-fqshv" Apr 21 14:55:37.605710 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:55:37.605590 2610 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-msfzc\" (UniqueName: \"kubernetes.io/projected/9b064625-50f7-4c6a-be44-9aed34a00b26-kube-api-access-msfzc\") pod \"network-metrics-daemon-mtdkf\" (UID: \"9b064625-50f7-4c6a-be44-9aed34a00b26\") " pod="openshift-multus/network-metrics-daemon-mtdkf" Apr 21 14:55:37.609647 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:55:37.609622 2610 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-fkvqs\" (UniqueName: \"kubernetes.io/projected/b48f3832-4ecd-46ba-bde8-35a4180bf3ca-kube-api-access-fkvqs\") pod \"node-resolver-n8qc9\" (UID: \"b48f3832-4ecd-46ba-bde8-35a4180bf3ca\") " pod="openshift-dns/node-resolver-n8qc9" Apr 21 14:55:37.609954 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:55:37.609929 2610 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-l8v4w\" (UniqueName: \"kubernetes.io/projected/efd7538a-c71f-4a2a-99a4-44675f8eab54-kube-api-access-l8v4w\") pod \"aws-ebs-csi-driver-node-fzdmw\" (UID: \"efd7538a-c71f-4a2a-99a4-44675f8eab54\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-fzdmw" Apr 21 14:55:37.610394 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:55:37.610375 2610 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-ct74t\" (UniqueName: 
\"kubernetes.io/projected/9c5fa724-5c79-4789-8467-fe3456892c7d-kube-api-access-ct74t\") pod \"iptables-alerter-xpgf5\" (UID: \"9c5fa724-5c79-4789-8467-fe3456892c7d\") " pod="openshift-network-operator/iptables-alerter-xpgf5" Apr 21 14:55:37.667449 ip-10-0-129-133 kubenswrapper[2610]: W0421 14:55:37.667402 2610 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4074072fa96faab5923784feb5b91477.slice/crio-012ba2c08cf24ef9f489ee122f675849570f96ea71690205c24f6505016c84d5 WatchSource:0}: Error finding container 012ba2c08cf24ef9f489ee122f675849570f96ea71690205c24f6505016c84d5: Status 404 returned error can't find the container with id 012ba2c08cf24ef9f489ee122f675849570f96ea71690205c24f6505016c84d5 Apr 21 14:55:37.668616 ip-10-0-129-133 kubenswrapper[2610]: W0421 14:55:37.668593 2610 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbec3f852eac0a721026594b796d3fd1a.slice/crio-a057f762f307e96c6f58c8662bf869baef8729bf73caa8c6c9475ca9069abae3 WatchSource:0}: Error finding container a057f762f307e96c6f58c8662bf869baef8729bf73caa8c6c9475ca9069abae3: Status 404 returned error can't find the container with id a057f762f307e96c6f58c8662bf869baef8729bf73caa8c6c9475ca9069abae3 Apr 21 14:55:37.672714 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:55:37.672698 2610 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 21 14:55:37.696083 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:55:37.696049 2610 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/1ecf557b-547c-4875-bb8d-a80ee4cd1f74-host-run-netns\") pod \"multus-whzlb\" (UID: \"1ecf557b-547c-4875-bb8d-a80ee4cd1f74\") " pod="openshift-multus/multus-whzlb" Apr 21 14:55:37.696083 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:55:37.696090 2610 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/cf009889-4a60-4449-8425-a8c15708e69e-tuning-conf-dir\") pod \"multus-additional-cni-plugins-kjdc5\" (UID: \"cf009889-4a60-4449-8425-a8c15708e69e\") " pod="openshift-multus/multus-additional-cni-plugins-kjdc5" Apr 21 14:55:37.696249 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:55:37.696114 2610 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/cf009889-4a60-4449-8425-a8c15708e69e-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-kjdc5\" (UID: \"cf009889-4a60-4449-8425-a8c15708e69e\") " pod="openshift-multus/multus-additional-cni-plugins-kjdc5" Apr 21 14:55:37.696249 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:55:37.696139 2610 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/1ecf557b-547c-4875-bb8d-a80ee4cd1f74-host-var-lib-cni-multus\") pod \"multus-whzlb\" (UID: \"1ecf557b-547c-4875-bb8d-a80ee4cd1f74\") " pod="openshift-multus/multus-whzlb" Apr 21 14:55:37.696249 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:55:37.696156 2610 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/1ecf557b-547c-4875-bb8d-a80ee4cd1f74-host-run-netns\") pod \"multus-whzlb\" (UID: \"1ecf557b-547c-4875-bb8d-a80ee4cd1f74\") " pod="openshift-multus/multus-whzlb" Apr 21 
14:55:37.696249 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:55:37.696167 2610 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/cf009889-4a60-4449-8425-a8c15708e69e-cnibin\") pod \"multus-additional-cni-plugins-kjdc5\" (UID: \"cf009889-4a60-4449-8425-a8c15708e69e\") " pod="openshift-multus/multus-additional-cni-plugins-kjdc5" Apr 21 14:55:37.696249 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:55:37.696193 2610 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/1ecf557b-547c-4875-bb8d-a80ee4cd1f74-host-var-lib-cni-multus\") pod \"multus-whzlb\" (UID: \"1ecf557b-547c-4875-bb8d-a80ee4cd1f74\") " pod="openshift-multus/multus-whzlb" Apr 21 14:55:37.696249 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:55:37.696203 2610 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/cf009889-4a60-4449-8425-a8c15708e69e-cnibin\") pod \"multus-additional-cni-plugins-kjdc5\" (UID: \"cf009889-4a60-4449-8425-a8c15708e69e\") " pod="openshift-multus/multus-additional-cni-plugins-kjdc5" Apr 21 14:55:37.696249 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:55:37.696200 2610 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/cf009889-4a60-4449-8425-a8c15708e69e-cni-binary-copy\") pod \"multus-additional-cni-plugins-kjdc5\" (UID: \"cf009889-4a60-4449-8425-a8c15708e69e\") " pod="openshift-multus/multus-additional-cni-plugins-kjdc5" Apr 21 14:55:37.696541 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:55:37.696270 2610 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/cf009889-4a60-4449-8425-a8c15708e69e-tuning-conf-dir\") pod \"multus-additional-cni-plugins-kjdc5\" (UID: \"cf009889-4a60-4449-8425-a8c15708e69e\") " pod="openshift-multus/multus-additional-cni-plugins-kjdc5" Apr 21 14:55:37.696541 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:55:37.696305 2610 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/1ecf557b-547c-4875-bb8d-a80ee4cd1f74-multus-cni-dir\") pod \"multus-whzlb\" (UID: \"1ecf557b-547c-4875-bb8d-a80ee4cd1f74\") " pod="openshift-multus/multus-whzlb" Apr 21 14:55:37.696541 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:55:37.696323 2610 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/1ecf557b-547c-4875-bb8d-a80ee4cd1f74-host-run-k8s-cni-cncf-io\") pod \"multus-whzlb\" (UID: \"1ecf557b-547c-4875-bb8d-a80ee4cd1f74\") " pod="openshift-multus/multus-whzlb" Apr 21 14:55:37.696541 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:55:37.696338 2610 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/1ecf557b-547c-4875-bb8d-a80ee4cd1f74-hostroot\") pod \"multus-whzlb\" (UID: \"1ecf557b-547c-4875-bb8d-a80ee4cd1f74\") " pod="openshift-multus/multus-whzlb" Apr 21 14:55:37.696541 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:55:37.696356 2610 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/72237d81-3f9e-4b04-a299-0acb0dd6604c-serviceca\") pod \"node-ca-z6hwp\" (UID: \"72237d81-3f9e-4b04-a299-0acb0dd6604c\") " 
pod="openshift-image-registry/node-ca-z6hwp" Apr 21 14:55:37.696541 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:55:37.696372 2610 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/1ecf557b-547c-4875-bb8d-a80ee4cd1f74-host-var-lib-cni-bin\") pod \"multus-whzlb\" (UID: \"1ecf557b-547c-4875-bb8d-a80ee4cd1f74\") " pod="openshift-multus/multus-whzlb" Apr 21 14:55:37.696541 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:55:37.696387 2610 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/1ecf557b-547c-4875-bb8d-a80ee4cd1f74-multus-conf-dir\") pod \"multus-whzlb\" (UID: \"1ecf557b-547c-4875-bb8d-a80ee4cd1f74\") " pod="openshift-multus/multus-whzlb" Apr 21 14:55:37.696541 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:55:37.696394 2610 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/1ecf557b-547c-4875-bb8d-a80ee4cd1f74-host-run-k8s-cni-cncf-io\") pod \"multus-whzlb\" (UID: \"1ecf557b-547c-4875-bb8d-a80ee4cd1f74\") " pod="openshift-multus/multus-whzlb" Apr 21 14:55:37.696541 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:55:37.696411 2610 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/bb8d83e4-e43f-4a6f-8423-a2feab619d63-etc-sysctl-d\") pod \"tuned-jr2bx\" (UID: \"bb8d83e4-e43f-4a6f-8423-a2feab619d63\") " pod="openshift-cluster-node-tuning-operator/tuned-jr2bx" Apr 21 14:55:37.696541 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:55:37.696422 2610 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/1ecf557b-547c-4875-bb8d-a80ee4cd1f74-hostroot\") pod \"multus-whzlb\" (UID: \"1ecf557b-547c-4875-bb8d-a80ee4cd1f74\") " pod="openshift-multus/multus-whzlb" Apr 21 14:55:37.696541 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:55:37.696435 2610 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/bb8d83e4-e43f-4a6f-8423-a2feab619d63-etc-sysctl-conf\") pod \"tuned-jr2bx\" (UID: \"bb8d83e4-e43f-4a6f-8423-a2feab619d63\") " pod="openshift-cluster-node-tuning-operator/tuned-jr2bx" Apr 21 14:55:37.696541 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:55:37.696458 2610 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/1ecf557b-547c-4875-bb8d-a80ee4cd1f74-multus-cni-dir\") pod \"multus-whzlb\" (UID: \"1ecf557b-547c-4875-bb8d-a80ee4cd1f74\") " pod="openshift-multus/multus-whzlb" Apr 21 14:55:37.696541 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:55:37.696472 2610 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/bb8d83e4-e43f-4a6f-8423-a2feab619d63-run\") pod \"tuned-jr2bx\" (UID: \"bb8d83e4-e43f-4a6f-8423-a2feab619d63\") " pod="openshift-cluster-node-tuning-operator/tuned-jr2bx" Apr 21 14:55:37.696541 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:55:37.696475 2610 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/1ecf557b-547c-4875-bb8d-a80ee4cd1f74-multus-conf-dir\") pod \"multus-whzlb\" (UID: \"1ecf557b-547c-4875-bb8d-a80ee4cd1f74\") " pod="openshift-multus/multus-whzlb" Apr 21 14:55:37.696541 
ip-10-0-129-133 kubenswrapper[2610]: I0421 14:55:37.696515 2610 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gntrw\" (UniqueName: \"kubernetes.io/projected/72237d81-3f9e-4b04-a299-0acb0dd6604c-kube-api-access-gntrw\") pod \"node-ca-z6hwp\" (UID: \"72237d81-3f9e-4b04-a299-0acb0dd6604c\") " pod="openshift-image-registry/node-ca-z6hwp" Apr 21 14:55:37.696541 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:55:37.696541 2610 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/1ecf557b-547c-4875-bb8d-a80ee4cd1f74-cnibin\") pod \"multus-whzlb\" (UID: \"1ecf557b-547c-4875-bb8d-a80ee4cd1f74\") " pod="openshift-multus/multus-whzlb" Apr 21 14:55:37.696541 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:55:37.696549 2610 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/bb8d83e4-e43f-4a6f-8423-a2feab619d63-etc-sysctl-conf\") pod \"tuned-jr2bx\" (UID: \"bb8d83e4-e43f-4a6f-8423-a2feab619d63\") " pod="openshift-cluster-node-tuning-operator/tuned-jr2bx" Apr 21 14:55:37.697305 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:55:37.696557 2610 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/bb8d83e4-e43f-4a6f-8423-a2feab619d63-run\") pod \"tuned-jr2bx\" (UID: \"bb8d83e4-e43f-4a6f-8423-a2feab619d63\") " pod="openshift-cluster-node-tuning-operator/tuned-jr2bx" Apr 21 14:55:37.697305 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:55:37.696565 2610 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/bb8d83e4-e43f-4a6f-8423-a2feab619d63-etc-systemd\") pod \"tuned-jr2bx\" (UID: \"bb8d83e4-e43f-4a6f-8423-a2feab619d63\") " pod="openshift-cluster-node-tuning-operator/tuned-jr2bx" Apr 21 14:55:37.697305 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:55:37.696591 2610 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/bb8d83e4-e43f-4a6f-8423-a2feab619d63-etc-sysctl-d\") pod \"tuned-jr2bx\" (UID: \"bb8d83e4-e43f-4a6f-8423-a2feab619d63\") " pod="openshift-cluster-node-tuning-operator/tuned-jr2bx" Apr 21 14:55:37.697305 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:55:37.696617 2610 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mvljq\" (UniqueName: \"kubernetes.io/projected/bb8d83e4-e43f-4a6f-8423-a2feab619d63-kube-api-access-mvljq\") pod \"tuned-jr2bx\" (UID: \"bb8d83e4-e43f-4a6f-8423-a2feab619d63\") " pod="openshift-cluster-node-tuning-operator/tuned-jr2bx" Apr 21 14:55:37.697305 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:55:37.696630 2610 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/1ecf557b-547c-4875-bb8d-a80ee4cd1f74-cnibin\") pod \"multus-whzlb\" (UID: \"1ecf557b-547c-4875-bb8d-a80ee4cd1f74\") " pod="openshift-multus/multus-whzlb" Apr 21 14:55:37.697305 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:55:37.696643 2610 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jmmhw\" (UniqueName: \"kubernetes.io/projected/3a01b39a-d3cb-4c13-ba47-ae57c73ab5c1-kube-api-access-jmmhw\") pod \"network-check-target-gphbk\" (UID: \"3a01b39a-d3cb-4c13-ba47-ae57c73ab5c1\") " pod="openshift-network-diagnostics/network-check-target-gphbk" Apr 21 14:55:37.697305 
ip-10-0-129-133 kubenswrapper[2610]: I0421 14:55:37.696675 2610 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/1ecf557b-547c-4875-bb8d-a80ee4cd1f74-system-cni-dir\") pod \"multus-whzlb\" (UID: \"1ecf557b-547c-4875-bb8d-a80ee4cd1f74\") " pod="openshift-multus/multus-whzlb" Apr 21 14:55:37.697305 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:55:37.696722 2610 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/cf009889-4a60-4449-8425-a8c15708e69e-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-kjdc5\" (UID: \"cf009889-4a60-4449-8425-a8c15708e69e\") " pod="openshift-multus/multus-additional-cni-plugins-kjdc5" Apr 21 14:55:37.697305 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:55:37.696715 2610 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/cf009889-4a60-4449-8425-a8c15708e69e-cni-binary-copy\") pod \"multus-additional-cni-plugins-kjdc5\" (UID: \"cf009889-4a60-4449-8425-a8c15708e69e\") " pod="openshift-multus/multus-additional-cni-plugins-kjdc5" Apr 21 14:55:37.697305 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:55:37.696645 2610 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/bb8d83e4-e43f-4a6f-8423-a2feab619d63-etc-systemd\") pod \"tuned-jr2bx\" (UID: \"bb8d83e4-e43f-4a6f-8423-a2feab619d63\") " pod="openshift-cluster-node-tuning-operator/tuned-jr2bx" Apr 21 14:55:37.697305 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:55:37.696777 2610 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/1ecf557b-547c-4875-bb8d-a80ee4cd1f74-system-cni-dir\") pod \"multus-whzlb\" (UID: \"1ecf557b-547c-4875-bb8d-a80ee4cd1f74\") " pod="openshift-multus/multus-whzlb" Apr 21 14:55:37.697305 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:55:37.696795 2610 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/1ecf557b-547c-4875-bb8d-a80ee4cd1f74-host-var-lib-cni-bin\") pod \"multus-whzlb\" (UID: \"1ecf557b-547c-4875-bb8d-a80ee4cd1f74\") " pod="openshift-multus/multus-whzlb" Apr 21 14:55:37.697305 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:55:37.696815 2610 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/72237d81-3f9e-4b04-a299-0acb0dd6604c-serviceca\") pod \"node-ca-z6hwp\" (UID: \"72237d81-3f9e-4b04-a299-0acb0dd6604c\") " pod="openshift-image-registry/node-ca-z6hwp" Apr 21 14:55:37.697305 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:55:37.696843 2610 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/1ecf557b-547c-4875-bb8d-a80ee4cd1f74-os-release\") pod \"multus-whzlb\" (UID: \"1ecf557b-547c-4875-bb8d-a80ee4cd1f74\") " pod="openshift-multus/multus-whzlb" Apr 21 14:55:37.697305 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:55:37.696871 2610 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/1ecf557b-547c-4875-bb8d-a80ee4cd1f74-multus-socket-dir-parent\") pod \"multus-whzlb\" (UID: \"1ecf557b-547c-4875-bb8d-a80ee4cd1f74\") " pod="openshift-multus/multus-whzlb" Apr 21 14:55:37.697305 
ip-10-0-129-133 kubenswrapper[2610]: I0421 14:55:37.696887 2610 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/1ecf557b-547c-4875-bb8d-a80ee4cd1f74-multus-daemon-config\") pod \"multus-whzlb\" (UID: \"1ecf557b-547c-4875-bb8d-a80ee4cd1f74\") " pod="openshift-multus/multus-whzlb" Apr 21 14:55:37.697305 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:55:37.696901 2610 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/1ecf557b-547c-4875-bb8d-a80ee4cd1f74-host-run-multus-certs\") pod \"multus-whzlb\" (UID: \"1ecf557b-547c-4875-bb8d-a80ee4cd1f74\") " pod="openshift-multus/multus-whzlb" Apr 21 14:55:37.697305 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:55:37.696917 2610 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/bb8d83e4-e43f-4a6f-8423-a2feab619d63-host\") pod \"tuned-jr2bx\" (UID: \"bb8d83e4-e43f-4a6f-8423-a2feab619d63\") " pod="openshift-cluster-node-tuning-operator/tuned-jr2bx" Apr 21 14:55:37.697986 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:55:37.696950 2610 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/72237d81-3f9e-4b04-a299-0acb0dd6604c-host\") pod \"node-ca-z6hwp\" (UID: \"72237d81-3f9e-4b04-a299-0acb0dd6604c\") " pod="openshift-image-registry/node-ca-z6hwp" Apr 21 14:55:37.697986 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:55:37.696952 2610 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/1ecf557b-547c-4875-bb8d-a80ee4cd1f74-os-release\") pod \"multus-whzlb\" (UID: \"1ecf557b-547c-4875-bb8d-a80ee4cd1f74\") " pod="openshift-multus/multus-whzlb" Apr 21 14:55:37.697986 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:55:37.696965 2610 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/bb8d83e4-e43f-4a6f-8423-a2feab619d63-etc-tuned\") pod \"tuned-jr2bx\" (UID: \"bb8d83e4-e43f-4a6f-8423-a2feab619d63\") " pod="openshift-cluster-node-tuning-operator/tuned-jr2bx" Apr 21 14:55:37.697986 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:55:37.696964 2610 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/1ecf557b-547c-4875-bb8d-a80ee4cd1f74-host-run-multus-certs\") pod \"multus-whzlb\" (UID: \"1ecf557b-547c-4875-bb8d-a80ee4cd1f74\") " pod="openshift-multus/multus-whzlb" Apr 21 14:55:37.697986 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:55:37.696983 2610 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/bb8d83e4-e43f-4a6f-8423-a2feab619d63-host\") pod \"tuned-jr2bx\" (UID: \"bb8d83e4-e43f-4a6f-8423-a2feab619d63\") " pod="openshift-cluster-node-tuning-operator/tuned-jr2bx" Apr 21 14:55:37.697986 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:55:37.696984 2610 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/cf009889-4a60-4449-8425-a8c15708e69e-system-cni-dir\") pod \"multus-additional-cni-plugins-kjdc5\" (UID: \"cf009889-4a60-4449-8425-a8c15708e69e\") " pod="openshift-multus/multus-additional-cni-plugins-kjdc5" Apr 21 14:55:37.697986 ip-10-0-129-133 kubenswrapper[2610]: I0421 
14:55:37.697009 2610 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zc5cf\" (UniqueName: \"kubernetes.io/projected/1ecf557b-547c-4875-bb8d-a80ee4cd1f74-kube-api-access-zc5cf\") pod \"multus-whzlb\" (UID: \"1ecf557b-547c-4875-bb8d-a80ee4cd1f74\") " pod="openshift-multus/multus-whzlb" Apr 21 14:55:37.697986 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:55:37.697026 2610 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/bb8d83e4-e43f-4a6f-8423-a2feab619d63-etc-modprobe-d\") pod \"tuned-jr2bx\" (UID: \"bb8d83e4-e43f-4a6f-8423-a2feab619d63\") " pod="openshift-cluster-node-tuning-operator/tuned-jr2bx" Apr 21 14:55:37.697986 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:55:37.697028 2610 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/72237d81-3f9e-4b04-a299-0acb0dd6604c-host\") pod \"node-ca-z6hwp\" (UID: \"72237d81-3f9e-4b04-a299-0acb0dd6604c\") " pod="openshift-image-registry/node-ca-z6hwp" Apr 21 14:55:37.697986 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:55:37.697054 2610 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/1ecf557b-547c-4875-bb8d-a80ee4cd1f74-etc-kubernetes\") pod \"multus-whzlb\" (UID: \"1ecf557b-547c-4875-bb8d-a80ee4cd1f74\") " pod="openshift-multus/multus-whzlb" Apr 21 14:55:37.697986 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:55:37.697082 2610 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/bb8d83e4-e43f-4a6f-8423-a2feab619d63-var-lib-kubelet\") pod \"tuned-jr2bx\" (UID: \"bb8d83e4-e43f-4a6f-8423-a2feab619d63\") " pod="openshift-cluster-node-tuning-operator/tuned-jr2bx" Apr 21 14:55:37.697986 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:55:37.697087 2610 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/bb8d83e4-e43f-4a6f-8423-a2feab619d63-etc-modprobe-d\") pod \"tuned-jr2bx\" (UID: \"bb8d83e4-e43f-4a6f-8423-a2feab619d63\") " pod="openshift-cluster-node-tuning-operator/tuned-jr2bx" Apr 21 14:55:37.697986 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:55:37.697119 2610 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/cf009889-4a60-4449-8425-a8c15708e69e-system-cni-dir\") pod \"multus-additional-cni-plugins-kjdc5\" (UID: \"cf009889-4a60-4449-8425-a8c15708e69e\") " pod="openshift-multus/multus-additional-cni-plugins-kjdc5" Apr 21 14:55:37.697986 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:55:37.697132 2610 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rs5cd\" (UniqueName: \"kubernetes.io/projected/cf009889-4a60-4449-8425-a8c15708e69e-kube-api-access-rs5cd\") pod \"multus-additional-cni-plugins-kjdc5\" (UID: \"cf009889-4a60-4449-8425-a8c15708e69e\") " pod="openshift-multus/multus-additional-cni-plugins-kjdc5" Apr 21 14:55:37.697986 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:55:37.697147 2610 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/1ecf557b-547c-4875-bb8d-a80ee4cd1f74-etc-kubernetes\") pod \"multus-whzlb\" (UID: \"1ecf557b-547c-4875-bb8d-a80ee4cd1f74\") " pod="openshift-multus/multus-whzlb" Apr 21 14:55:37.697986 
ip-10-0-129-133 kubenswrapper[2610]: I0421 14:55:37.697157 2610 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/1ecf557b-547c-4875-bb8d-a80ee4cd1f74-cni-binary-copy\") pod \"multus-whzlb\" (UID: \"1ecf557b-547c-4875-bb8d-a80ee4cd1f74\") " pod="openshift-multus/multus-whzlb" Apr 21 14:55:37.697986 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:55:37.697190 2610 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/1ecf557b-547c-4875-bb8d-a80ee4cd1f74-host-var-lib-kubelet\") pod \"multus-whzlb\" (UID: \"1ecf557b-547c-4875-bb8d-a80ee4cd1f74\") " pod="openshift-multus/multus-whzlb" Apr 21 14:55:37.697986 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:55:37.697218 2610 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/bb8d83e4-e43f-4a6f-8423-a2feab619d63-etc-sysconfig\") pod \"tuned-jr2bx\" (UID: \"bb8d83e4-e43f-4a6f-8423-a2feab619d63\") " pod="openshift-cluster-node-tuning-operator/tuned-jr2bx" Apr 21 14:55:37.698463 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:55:37.697243 2610 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/bb8d83e4-e43f-4a6f-8423-a2feab619d63-etc-kubernetes\") pod \"tuned-jr2bx\" (UID: \"bb8d83e4-e43f-4a6f-8423-a2feab619d63\") " pod="openshift-cluster-node-tuning-operator/tuned-jr2bx" Apr 21 14:55:37.698463 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:55:37.697268 2610 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/bb8d83e4-e43f-4a6f-8423-a2feab619d63-sys\") pod \"tuned-jr2bx\" (UID: \"bb8d83e4-e43f-4a6f-8423-a2feab619d63\") " pod="openshift-cluster-node-tuning-operator/tuned-jr2bx" Apr 21 14:55:37.698463 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:55:37.697293 2610 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/cf009889-4a60-4449-8425-a8c15708e69e-os-release\") pod \"multus-additional-cni-plugins-kjdc5\" (UID: \"cf009889-4a60-4449-8425-a8c15708e69e\") " pod="openshift-multus/multus-additional-cni-plugins-kjdc5" Apr 21 14:55:37.698463 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:55:37.697299 2610 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/1ecf557b-547c-4875-bb8d-a80ee4cd1f74-host-var-lib-kubelet\") pod \"multus-whzlb\" (UID: \"1ecf557b-547c-4875-bb8d-a80ee4cd1f74\") " pod="openshift-multus/multus-whzlb" Apr 21 14:55:37.698463 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:55:37.697151 2610 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/bb8d83e4-e43f-4a6f-8423-a2feab619d63-var-lib-kubelet\") pod \"tuned-jr2bx\" (UID: \"bb8d83e4-e43f-4a6f-8423-a2feab619d63\") " pod="openshift-cluster-node-tuning-operator/tuned-jr2bx" Apr 21 14:55:37.698463 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:55:37.697323 2610 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/bb8d83e4-e43f-4a6f-8423-a2feab619d63-lib-modules\") pod \"tuned-jr2bx\" (UID: \"bb8d83e4-e43f-4a6f-8423-a2feab619d63\") " pod="openshift-cluster-node-tuning-operator/tuned-jr2bx" Apr 21 
14:55:37.698463 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:55:37.697361 2610 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/bb8d83e4-e43f-4a6f-8423-a2feab619d63-etc-kubernetes\") pod \"tuned-jr2bx\" (UID: \"bb8d83e4-e43f-4a6f-8423-a2feab619d63\") " pod="openshift-cluster-node-tuning-operator/tuned-jr2bx" Apr 21 14:55:37.698463 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:55:37.697391 2610 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/bb8d83e4-e43f-4a6f-8423-a2feab619d63-tmp\") pod \"tuned-jr2bx\" (UID: \"bb8d83e4-e43f-4a6f-8423-a2feab619d63\") " pod="openshift-cluster-node-tuning-operator/tuned-jr2bx" Apr 21 14:55:37.698463 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:55:37.697394 2610 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/bb8d83e4-e43f-4a6f-8423-a2feab619d63-etc-sysconfig\") pod \"tuned-jr2bx\" (UID: \"bb8d83e4-e43f-4a6f-8423-a2feab619d63\") " pod="openshift-cluster-node-tuning-operator/tuned-jr2bx" Apr 21 14:55:37.698463 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:55:37.697441 2610 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/bb8d83e4-e43f-4a6f-8423-a2feab619d63-sys\") pod \"tuned-jr2bx\" (UID: \"bb8d83e4-e43f-4a6f-8423-a2feab619d63\") " pod="openshift-cluster-node-tuning-operator/tuned-jr2bx" Apr 21 14:55:37.698463 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:55:37.697449 2610 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/cf009889-4a60-4449-8425-a8c15708e69e-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-kjdc5\" (UID: \"cf009889-4a60-4449-8425-a8c15708e69e\") " pod="openshift-multus/multus-additional-cni-plugins-kjdc5" Apr 21 14:55:37.698463 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:55:37.697528 2610 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/1ecf557b-547c-4875-bb8d-a80ee4cd1f74-multus-daemon-config\") pod \"multus-whzlb\" (UID: \"1ecf557b-547c-4875-bb8d-a80ee4cd1f74\") " pod="openshift-multus/multus-whzlb" Apr 21 14:55:37.698463 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:55:37.697538 2610 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/bb8d83e4-e43f-4a6f-8423-a2feab619d63-lib-modules\") pod \"tuned-jr2bx\" (UID: \"bb8d83e4-e43f-4a6f-8423-a2feab619d63\") " pod="openshift-cluster-node-tuning-operator/tuned-jr2bx" Apr 21 14:55:37.698463 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:55:37.697632 2610 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/1ecf557b-547c-4875-bb8d-a80ee4cd1f74-cni-binary-copy\") pod \"multus-whzlb\" (UID: \"1ecf557b-547c-4875-bb8d-a80ee4cd1f74\") " pod="openshift-multus/multus-whzlb" Apr 21 14:55:37.698463 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:55:37.697637 2610 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/1ecf557b-547c-4875-bb8d-a80ee4cd1f74-multus-socket-dir-parent\") pod \"multus-whzlb\" (UID: \"1ecf557b-547c-4875-bb8d-a80ee4cd1f74\") " pod="openshift-multus/multus-whzlb" Apr 21 
14:55:37.698463 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:55:37.697394 2610 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/cf009889-4a60-4449-8425-a8c15708e69e-os-release\") pod \"multus-additional-cni-plugins-kjdc5\" (UID: \"cf009889-4a60-4449-8425-a8c15708e69e\") " pod="openshift-multus/multus-additional-cni-plugins-kjdc5" Apr 21 14:55:37.698463 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:55:37.697874 2610 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/cf009889-4a60-4449-8425-a8c15708e69e-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-kjdc5\" (UID: \"cf009889-4a60-4449-8425-a8c15708e69e\") " pod="openshift-multus/multus-additional-cni-plugins-kjdc5" Apr 21 14:55:37.699307 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:55:37.699289 2610 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/bb8d83e4-e43f-4a6f-8423-a2feab619d63-etc-tuned\") pod \"tuned-jr2bx\" (UID: \"bb8d83e4-e43f-4a6f-8423-a2feab619d63\") " pod="openshift-cluster-node-tuning-operator/tuned-jr2bx" Apr 21 14:55:37.699372 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:55:37.699358 2610 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/bb8d83e4-e43f-4a6f-8423-a2feab619d63-tmp\") pod \"tuned-jr2bx\" (UID: \"bb8d83e4-e43f-4a6f-8423-a2feab619d63\") " pod="openshift-cluster-node-tuning-operator/tuned-jr2bx" Apr 21 14:55:37.709309 ip-10-0-129-133 kubenswrapper[2610]: E0421 14:55:37.709251 2610 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 21 14:55:37.709309 ip-10-0-129-133 kubenswrapper[2610]: E0421 14:55:37.709275 2610 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 21 14:55:37.709309 ip-10-0-129-133 kubenswrapper[2610]: E0421 14:55:37.709287 2610 projected.go:194] Error preparing data for projected volume kube-api-access-jmmhw for pod openshift-network-diagnostics/network-check-target-gphbk: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 21 14:55:37.709510 ip-10-0-129-133 kubenswrapper[2610]: E0421 14:55:37.709350 2610 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3a01b39a-d3cb-4c13-ba47-ae57c73ab5c1-kube-api-access-jmmhw podName:3a01b39a-d3cb-4c13-ba47-ae57c73ab5c1 nodeName:}" failed. No retries permitted until 2026-04-21 14:55:38.20933187 +0000 UTC m=+2.193223915 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-jmmhw" (UniqueName: "kubernetes.io/projected/3a01b39a-d3cb-4c13-ba47-ae57c73ab5c1-kube-api-access-jmmhw") pod "network-check-target-gphbk" (UID: "3a01b39a-d3cb-4c13-ba47-ae57c73ab5c1") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 21 14:55:37.710664 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:55:37.710641 2610 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-mvljq\" (UniqueName: \"kubernetes.io/projected/bb8d83e4-e43f-4a6f-8423-a2feab619d63-kube-api-access-mvljq\") pod \"tuned-jr2bx\" (UID: \"bb8d83e4-e43f-4a6f-8423-a2feab619d63\") " pod="openshift-cluster-node-tuning-operator/tuned-jr2bx" Apr 21 14:55:37.711915 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:55:37.711891 2610 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-zc5cf\" (UniqueName: \"kubernetes.io/projected/1ecf557b-547c-4875-bb8d-a80ee4cd1f74-kube-api-access-zc5cf\") pod \"multus-whzlb\" (UID: \"1ecf557b-547c-4875-bb8d-a80ee4cd1f74\") " pod="openshift-multus/multus-whzlb" Apr 21 14:55:37.712148 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:55:37.712124 2610 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-gntrw\" (UniqueName: \"kubernetes.io/projected/72237d81-3f9e-4b04-a299-0acb0dd6604c-kube-api-access-gntrw\") pod \"node-ca-z6hwp\" (UID: \"72237d81-3f9e-4b04-a299-0acb0dd6604c\") " pod="openshift-image-registry/node-ca-z6hwp" Apr 21 14:55:37.712403 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:55:37.712388 2610 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-rs5cd\" (UniqueName: \"kubernetes.io/projected/cf009889-4a60-4449-8425-a8c15708e69e-kube-api-access-rs5cd\") pod \"multus-additional-cni-plugins-kjdc5\" (UID: \"cf009889-4a60-4449-8425-a8c15708e69e\") " pod="openshift-multus/multus-additional-cni-plugins-kjdc5" Apr 21 14:55:37.799956 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:55:37.799910 2610 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-xpgf5" Apr 21 14:55:37.805858 ip-10-0-129-133 kubenswrapper[2610]: W0421 14:55:37.805833 2610 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9c5fa724_5c79_4789_8467_fe3456892c7d.slice/crio-beaa987a006102d98b6b76554fbae7f0c183c6a45560fdf50bf8ec765fcf6541 WatchSource:0}: Error finding container beaa987a006102d98b6b76554fbae7f0c183c6a45560fdf50bf8ec765fcf6541: Status 404 returned error can't find the container with id beaa987a006102d98b6b76554fbae7f0c183c6a45560fdf50bf8ec765fcf6541 Apr 21 14:55:37.814208 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:55:37.814184 2610 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-fqshv" Apr 21 14:55:37.819785 ip-10-0-129-133 kubenswrapper[2610]: W0421 14:55:37.819760 2610 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcff88e04_38fb_4737_b9d3_2f25f36cf06c.slice/crio-3caa1cf3b317d7aa498eb94b61be14270df492b85b5407c0a8bb95c91cee0915 WatchSource:0}: Error finding container 3caa1cf3b317d7aa498eb94b61be14270df492b85b5407c0a8bb95c91cee0915: Status 404 returned error can't find the container with id 3caa1cf3b317d7aa498eb94b61be14270df492b85b5407c0a8bb95c91cee0915 Apr 21 14:55:37.838867 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:55:37.838847 2610 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-fzdmw" Apr 21 14:55:37.844345 ip-10-0-129-133 kubenswrapper[2610]: W0421 14:55:37.844323 2610 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podefd7538a_c71f_4a2a_99a4_44675f8eab54.slice/crio-2389318ac17e876e6f860217556e878e550e10da3029c855a2821792bf0ba916 WatchSource:0}: Error finding container 2389318ac17e876e6f860217556e878e550e10da3029c855a2821792bf0ba916: Status 404 returned error can't find the container with id 2389318ac17e876e6f860217556e878e550e10da3029c855a2821792bf0ba916 Apr 21 14:55:37.857601 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:55:37.857584 2610 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-n8qc9" Apr 21 14:55:37.863123 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:55:37.863103 2610 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-qgsvb" Apr 21 14:55:37.864285 ip-10-0-129-133 kubenswrapper[2610]: W0421 14:55:37.864266 2610 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb48f3832_4ecd_46ba_bde8_35a4180bf3ca.slice/crio-63cd079e27ff8fdf31ce0d0cb530afc97d925ccdb536a5cc6d4197605d579048 WatchSource:0}: Error finding container 63cd079e27ff8fdf31ce0d0cb530afc97d925ccdb536a5cc6d4197605d579048: Status 404 returned error can't find the container with id 63cd079e27ff8fdf31ce0d0cb530afc97d925ccdb536a5cc6d4197605d579048 Apr 21 14:55:37.870735 ip-10-0-129-133 kubenswrapper[2610]: W0421 14:55:37.870714 2610 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod002f040e_530f_43cc_92d7_0789dd3ec88e.slice/crio-af27329e665afad66adbd5ad3c108bfa400715259d90694d84eea28135cc926b WatchSource:0}: Error finding container af27329e665afad66adbd5ad3c108bfa400715259d90694d84eea28135cc926b: Status 404 returned error can't find the container with id af27329e665afad66adbd5ad3c108bfa400715259d90694d84eea28135cc926b Apr 21 14:55:37.878267 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:55:37.878249 2610 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-kjdc5" Apr 21 14:55:37.884268 ip-10-0-129-133 kubenswrapper[2610]: W0421 14:55:37.884242 2610 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcf009889_4a60_4449_8425_a8c15708e69e.slice/crio-015f2ddaa732d97fc318ee0fa4d9ff55534c2f3ff58b348d6f13376fe9cb12a6 WatchSource:0}: Error finding container 015f2ddaa732d97fc318ee0fa4d9ff55534c2f3ff58b348d6f13376fe9cb12a6: Status 404 returned error can't find the container with id 015f2ddaa732d97fc318ee0fa4d9ff55534c2f3ff58b348d6f13376fe9cb12a6 Apr 21 14:55:37.895279 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:55:37.895263 2610 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-whzlb" Apr 21 14:55:37.901477 ip-10-0-129-133 kubenswrapper[2610]: W0421 14:55:37.901456 2610 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1ecf557b_547c_4875_bb8d_a80ee4cd1f74.slice/crio-c0ce4b8ec89270c66f7369bddabfe44aa342f6a8fe49b23f1e04afd908bc7cdc WatchSource:0}: Error finding container c0ce4b8ec89270c66f7369bddabfe44aa342f6a8fe49b23f1e04afd908bc7cdc: Status 404 returned error can't find the container with id c0ce4b8ec89270c66f7369bddabfe44aa342f6a8fe49b23f1e04afd908bc7cdc Apr 21 14:55:37.918439 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:55:37.918418 2610 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-jr2bx" Apr 21 14:55:37.922951 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:55:37.922933 2610 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-z6hwp" Apr 21 14:55:37.924431 ip-10-0-129-133 kubenswrapper[2610]: W0421 14:55:37.924407 2610 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbb8d83e4_e43f_4a6f_8423_a2feab619d63.slice/crio-719956a07a94672b902ec11b0260fcce84fa44be4b09549ab156729ae9cdacd1 WatchSource:0}: Error finding container 719956a07a94672b902ec11b0260fcce84fa44be4b09549ab156729ae9cdacd1: Status 404 returned error can't find the container with id 719956a07a94672b902ec11b0260fcce84fa44be4b09549ab156729ae9cdacd1 Apr 21 14:55:37.925794 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:55:37.925767 2610 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160" Apr 21 14:55:37.929088 ip-10-0-129-133 kubenswrapper[2610]: W0421 14:55:37.929070 2610 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod72237d81_3f9e_4b04_a299_0acb0dd6604c.slice/crio-3cdaf4b253c93eae782bd26e3a61cfbc47e7dec0270ec7141827e0ce3ebc7919 WatchSource:0}: Error finding container 3cdaf4b253c93eae782bd26e3a61cfbc47e7dec0270ec7141827e0ce3ebc7919: Status 404 returned error can't find the container with id 3cdaf4b253c93eae782bd26e3a61cfbc47e7dec0270ec7141827e0ce3ebc7919 Apr 21 14:55:38.099840 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:55:38.099740 2610 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/9b064625-50f7-4c6a-be44-9aed34a00b26-metrics-certs\") pod \"network-metrics-daemon-mtdkf\" (UID: \"9b064625-50f7-4c6a-be44-9aed34a00b26\") " pod="openshift-multus/network-metrics-daemon-mtdkf" Apr 21 
14:55:38.100001 ip-10-0-129-133 kubenswrapper[2610]: E0421 14:55:38.099897 2610 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 21 14:55:38.100001 ip-10-0-129-133 kubenswrapper[2610]: E0421 14:55:38.099980 2610 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9b064625-50f7-4c6a-be44-9aed34a00b26-metrics-certs podName:9b064625-50f7-4c6a-be44-9aed34a00b26 nodeName:}" failed. No retries permitted until 2026-04-21 14:55:39.099959937 +0000 UTC m=+3.083851979 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/9b064625-50f7-4c6a-be44-9aed34a00b26-metrics-certs") pod "network-metrics-daemon-mtdkf" (UID: "9b064625-50f7-4c6a-be44-9aed34a00b26") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 21 14:55:38.301599 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:55:38.301540 2610 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jmmhw\" (UniqueName: \"kubernetes.io/projected/3a01b39a-d3cb-4c13-ba47-ae57c73ab5c1-kube-api-access-jmmhw\") pod \"network-check-target-gphbk\" (UID: \"3a01b39a-d3cb-4c13-ba47-ae57c73ab5c1\") " pod="openshift-network-diagnostics/network-check-target-gphbk" Apr 21 14:55:38.301774 ip-10-0-129-133 kubenswrapper[2610]: E0421 14:55:38.301737 2610 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 21 14:55:38.301774 ip-10-0-129-133 kubenswrapper[2610]: E0421 14:55:38.301757 2610 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 21 14:55:38.301774 ip-10-0-129-133 kubenswrapper[2610]: E0421 14:55:38.301770 2610 projected.go:194] Error preparing data for projected volume kube-api-access-jmmhw for pod openshift-network-diagnostics/network-check-target-gphbk: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 21 14:55:38.301921 ip-10-0-129-133 kubenswrapper[2610]: E0421 14:55:38.301842 2610 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3a01b39a-d3cb-4c13-ba47-ae57c73ab5c1-kube-api-access-jmmhw podName:3a01b39a-d3cb-4c13-ba47-ae57c73ab5c1 nodeName:}" failed. No retries permitted until 2026-04-21 14:55:39.30181977 +0000 UTC m=+3.285711802 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-jmmhw" (UniqueName: "kubernetes.io/projected/3a01b39a-d3cb-4c13-ba47-ae57c73ab5c1-kube-api-access-jmmhw") pod "network-check-target-gphbk" (UID: "3a01b39a-d3cb-4c13-ba47-ae57c73ab5c1") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 21 14:55:38.514392 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:55:38.514321 2610 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 21 14:55:38.563070 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:55:38.563028 2610 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-20 14:50:37 +0000 UTC" deadline="2028-02-06 00:25:48.326473706 +0000 UTC" Apr 21 14:55:38.563070 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:55:38.563068 2610 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="15729h30m9.763409931s" Apr 21 14:55:38.621704 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:55:38.621520 2610 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-mtdkf" Apr 21 14:55:38.621704 ip-10-0-129-133 kubenswrapper[2610]: E0421 14:55:38.621685 2610 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-mtdkf" podUID="9b064625-50f7-4c6a-be44-9aed34a00b26" Apr 21 14:55:38.646991 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:55:38.646921 2610 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-kjdc5" event={"ID":"cf009889-4a60-4449-8425-a8c15708e69e","Type":"ContainerStarted","Data":"015f2ddaa732d97fc318ee0fa4d9ff55534c2f3ff58b348d6f13376fe9cb12a6"} Apr 21 14:55:38.655901 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:55:38.655839 2610 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-qgsvb" event={"ID":"002f040e-530f-43cc-92d7-0789dd3ec88e","Type":"ContainerStarted","Data":"af27329e665afad66adbd5ad3c108bfa400715259d90694d84eea28135cc926b"} Apr 21 14:55:38.669391 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:55:38.669326 2610 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-n8qc9" event={"ID":"b48f3832-4ecd-46ba-bde8-35a4180bf3ca","Type":"ContainerStarted","Data":"63cd079e27ff8fdf31ce0d0cb530afc97d925ccdb536a5cc6d4197605d579048"} Apr 21 14:55:38.683594 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:55:38.683546 2610 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-fzdmw" event={"ID":"efd7538a-c71f-4a2a-99a4-44675f8eab54","Type":"ContainerStarted","Data":"2389318ac17e876e6f860217556e878e550e10da3029c855a2821792bf0ba916"} Apr 21 14:55:38.707565 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:55:38.707530 2610 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-xpgf5" event={"ID":"9c5fa724-5c79-4789-8467-fe3456892c7d","Type":"ContainerStarted","Data":"beaa987a006102d98b6b76554fbae7f0c183c6a45560fdf50bf8ec765fcf6541"} Apr 21 14:55:38.713082 
ip-10-0-129-133 kubenswrapper[2610]: I0421 14:55:38.713052 2610 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-133.ec2.internal" event={"ID":"bec3f852eac0a721026594b796d3fd1a","Type":"ContainerStarted","Data":"a057f762f307e96c6f58c8662bf869baef8729bf73caa8c6c9475ca9069abae3"} Apr 21 14:55:38.727255 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:55:38.727223 2610 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-jr2bx" event={"ID":"bb8d83e4-e43f-4a6f-8423-a2feab619d63","Type":"ContainerStarted","Data":"719956a07a94672b902ec11b0260fcce84fa44be4b09549ab156729ae9cdacd1"} Apr 21 14:55:38.749067 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:55:38.749030 2610 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-fqshv" event={"ID":"cff88e04-38fb-4737-b9d3-2f25f36cf06c","Type":"ContainerStarted","Data":"3caa1cf3b317d7aa498eb94b61be14270df492b85b5407c0a8bb95c91cee0915"} Apr 21 14:55:38.785108 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:55:38.785019 2610 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-129-133.ec2.internal" event={"ID":"4074072fa96faab5923784feb5b91477","Type":"ContainerStarted","Data":"012ba2c08cf24ef9f489ee122f675849570f96ea71690205c24f6505016c84d5"} Apr 21 14:55:38.811621 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:55:38.811566 2610 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-z6hwp" event={"ID":"72237d81-3f9e-4b04-a299-0acb0dd6604c","Type":"ContainerStarted","Data":"3cdaf4b253c93eae782bd26e3a61cfbc47e7dec0270ec7141827e0ce3ebc7919"} Apr 21 14:55:38.827594 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:55:38.827541 2610 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-whzlb" event={"ID":"1ecf557b-547c-4875-bb8d-a80ee4cd1f74","Type":"ContainerStarted","Data":"c0ce4b8ec89270c66f7369bddabfe44aa342f6a8fe49b23f1e04afd908bc7cdc"} Apr 21 14:55:39.107992 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:55:39.107913 2610 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/9b064625-50f7-4c6a-be44-9aed34a00b26-metrics-certs\") pod \"network-metrics-daemon-mtdkf\" (UID: \"9b064625-50f7-4c6a-be44-9aed34a00b26\") " pod="openshift-multus/network-metrics-daemon-mtdkf" Apr 21 14:55:39.108149 ip-10-0-129-133 kubenswrapper[2610]: E0421 14:55:39.108077 2610 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 21 14:55:39.108149 ip-10-0-129-133 kubenswrapper[2610]: E0421 14:55:39.108139 2610 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9b064625-50f7-4c6a-be44-9aed34a00b26-metrics-certs podName:9b064625-50f7-4c6a-be44-9aed34a00b26 nodeName:}" failed. No retries permitted until 2026-04-21 14:55:41.10812032 +0000 UTC m=+5.092012349 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/9b064625-50f7-4c6a-be44-9aed34a00b26-metrics-certs") pod "network-metrics-daemon-mtdkf" (UID: "9b064625-50f7-4c6a-be44-9aed34a00b26") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 21 14:55:39.309800 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:55:39.309526 2610 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jmmhw\" (UniqueName: \"kubernetes.io/projected/3a01b39a-d3cb-4c13-ba47-ae57c73ab5c1-kube-api-access-jmmhw\") pod \"network-check-target-gphbk\" (UID: \"3a01b39a-d3cb-4c13-ba47-ae57c73ab5c1\") " pod="openshift-network-diagnostics/network-check-target-gphbk" Apr 21 14:55:39.309998 ip-10-0-129-133 kubenswrapper[2610]: E0421 14:55:39.309979 2610 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 21 14:55:39.310078 ip-10-0-129-133 kubenswrapper[2610]: E0421 14:55:39.310006 2610 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 21 14:55:39.310078 ip-10-0-129-133 kubenswrapper[2610]: E0421 14:55:39.310018 2610 projected.go:194] Error preparing data for projected volume kube-api-access-jmmhw for pod openshift-network-diagnostics/network-check-target-gphbk: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 21 14:55:39.310078 ip-10-0-129-133 kubenswrapper[2610]: E0421 14:55:39.310077 2610 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3a01b39a-d3cb-4c13-ba47-ae57c73ab5c1-kube-api-access-jmmhw podName:3a01b39a-d3cb-4c13-ba47-ae57c73ab5c1 nodeName:}" failed. No retries permitted until 2026-04-21 14:55:41.310059617 +0000 UTC m=+5.293951657 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-jmmhw" (UniqueName: "kubernetes.io/projected/3a01b39a-d3cb-4c13-ba47-ae57c73ab5c1-kube-api-access-jmmhw") pod "network-check-target-gphbk" (UID: "3a01b39a-d3cb-4c13-ba47-ae57c73ab5c1") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 21 14:55:39.563854 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:55:39.563767 2610 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-20 14:50:37 +0000 UTC" deadline="2027-10-19 01:41:52.104183875 +0000 UTC" Apr 21 14:55:39.563854 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:55:39.563807 2610 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="13090h46m12.540381488s" Apr 21 14:55:39.621359 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:55:39.620863 2610 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-gphbk" Apr 21 14:55:39.621359 ip-10-0-129-133 kubenswrapper[2610]: E0421 14:55:39.620992 2610 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-gphbk" podUID="3a01b39a-d3cb-4c13-ba47-ae57c73ab5c1" Apr 21 14:55:40.620420 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:55:40.620386 2610 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-mtdkf" Apr 21 14:55:40.620871 ip-10-0-129-133 kubenswrapper[2610]: E0421 14:55:40.620522 2610 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-mtdkf" podUID="9b064625-50f7-4c6a-be44-9aed34a00b26" Apr 21 14:55:41.125067 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:55:41.125029 2610 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/9b064625-50f7-4c6a-be44-9aed34a00b26-metrics-certs\") pod \"network-metrics-daemon-mtdkf\" (UID: \"9b064625-50f7-4c6a-be44-9aed34a00b26\") " pod="openshift-multus/network-metrics-daemon-mtdkf" Apr 21 14:55:41.125253 ip-10-0-129-133 kubenswrapper[2610]: E0421 14:55:41.125217 2610 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 21 14:55:41.125314 ip-10-0-129-133 kubenswrapper[2610]: E0421 14:55:41.125282 2610 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9b064625-50f7-4c6a-be44-9aed34a00b26-metrics-certs podName:9b064625-50f7-4c6a-be44-9aed34a00b26 nodeName:}" failed. No retries permitted until 2026-04-21 14:55:45.125263572 +0000 UTC m=+9.109155622 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/9b064625-50f7-4c6a-be44-9aed34a00b26-metrics-certs") pod "network-metrics-daemon-mtdkf" (UID: "9b064625-50f7-4c6a-be44-9aed34a00b26") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 21 14:55:41.326648 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:55:41.326602 2610 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jmmhw\" (UniqueName: \"kubernetes.io/projected/3a01b39a-d3cb-4c13-ba47-ae57c73ab5c1-kube-api-access-jmmhw\") pod \"network-check-target-gphbk\" (UID: \"3a01b39a-d3cb-4c13-ba47-ae57c73ab5c1\") " pod="openshift-network-diagnostics/network-check-target-gphbk" Apr 21 14:55:41.326832 ip-10-0-129-133 kubenswrapper[2610]: E0421 14:55:41.326716 2610 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 21 14:55:41.326832 ip-10-0-129-133 kubenswrapper[2610]: E0421 14:55:41.326745 2610 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 21 14:55:41.326832 ip-10-0-129-133 kubenswrapper[2610]: E0421 14:55:41.326760 2610 projected.go:194] Error preparing data for projected volume kube-api-access-jmmhw for pod openshift-network-diagnostics/network-check-target-gphbk: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 21 14:55:41.326832 ip-10-0-129-133 kubenswrapper[2610]: E0421 14:55:41.326811 2610 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3a01b39a-d3cb-4c13-ba47-ae57c73ab5c1-kube-api-access-jmmhw podName:3a01b39a-d3cb-4c13-ba47-ae57c73ab5c1 nodeName:}" failed. No retries permitted until 2026-04-21 14:55:45.326796941 +0000 UTC m=+9.310688966 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-jmmhw" (UniqueName: "kubernetes.io/projected/3a01b39a-d3cb-4c13-ba47-ae57c73ab5c1-kube-api-access-jmmhw") pod "network-check-target-gphbk" (UID: "3a01b39a-d3cb-4c13-ba47-ae57c73ab5c1") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 21 14:55:41.620286 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:55:41.620250 2610 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-gphbk" Apr 21 14:55:41.620451 ip-10-0-129-133 kubenswrapper[2610]: E0421 14:55:41.620384 2610 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-gphbk" podUID="3a01b39a-d3cb-4c13-ba47-ae57c73ab5c1" Apr 21 14:55:42.620495 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:55:42.620457 2610 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-mtdkf" Apr 21 14:55:42.621002 ip-10-0-129-133 kubenswrapper[2610]: E0421 14:55:42.620569 2610 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-mtdkf" podUID="9b064625-50f7-4c6a-be44-9aed34a00b26" Apr 21 14:55:43.621118 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:55:43.621087 2610 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-gphbk" Apr 21 14:55:43.621567 ip-10-0-129-133 kubenswrapper[2610]: E0421 14:55:43.621204 2610 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-gphbk" podUID="3a01b39a-d3cb-4c13-ba47-ae57c73ab5c1" Apr 21 14:55:44.622207 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:55:44.621777 2610 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-mtdkf" Apr 21 14:55:44.622207 ip-10-0-129-133 kubenswrapper[2610]: E0421 14:55:44.621916 2610 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-mtdkf" podUID="9b064625-50f7-4c6a-be44-9aed34a00b26" Apr 21 14:55:45.161063 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:55:45.161028 2610 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/9b064625-50f7-4c6a-be44-9aed34a00b26-metrics-certs\") pod \"network-metrics-daemon-mtdkf\" (UID: \"9b064625-50f7-4c6a-be44-9aed34a00b26\") " pod="openshift-multus/network-metrics-daemon-mtdkf" Apr 21 14:55:45.161244 ip-10-0-129-133 kubenswrapper[2610]: E0421 14:55:45.161205 2610 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 21 14:55:45.161309 ip-10-0-129-133 kubenswrapper[2610]: E0421 14:55:45.161283 2610 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9b064625-50f7-4c6a-be44-9aed34a00b26-metrics-certs podName:9b064625-50f7-4c6a-be44-9aed34a00b26 nodeName:}" failed. No retries permitted until 2026-04-21 14:55:53.161262243 +0000 UTC m=+17.145154272 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/9b064625-50f7-4c6a-be44-9aed34a00b26-metrics-certs") pod "network-metrics-daemon-mtdkf" (UID: "9b064625-50f7-4c6a-be44-9aed34a00b26") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 21 14:55:45.363215 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:55:45.363175 2610 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jmmhw\" (UniqueName: \"kubernetes.io/projected/3a01b39a-d3cb-4c13-ba47-ae57c73ab5c1-kube-api-access-jmmhw\") pod \"network-check-target-gphbk\" (UID: \"3a01b39a-d3cb-4c13-ba47-ae57c73ab5c1\") " pod="openshift-network-diagnostics/network-check-target-gphbk" Apr 21 14:55:45.363396 ip-10-0-129-133 kubenswrapper[2610]: E0421 14:55:45.363343 2610 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 21 14:55:45.363396 ip-10-0-129-133 kubenswrapper[2610]: E0421 14:55:45.363363 2610 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 21 14:55:45.363396 ip-10-0-129-133 kubenswrapper[2610]: E0421 14:55:45.363377 2610 projected.go:194] Error preparing data for projected volume kube-api-access-jmmhw for pod openshift-network-diagnostics/network-check-target-gphbk: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 21 14:55:45.363554 ip-10-0-129-133 kubenswrapper[2610]: E0421 14:55:45.363436 2610 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3a01b39a-d3cb-4c13-ba47-ae57c73ab5c1-kube-api-access-jmmhw podName:3a01b39a-d3cb-4c13-ba47-ae57c73ab5c1 nodeName:}" failed. No retries permitted until 2026-04-21 14:55:53.363419044 +0000 UTC m=+17.347311071 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-jmmhw" (UniqueName: "kubernetes.io/projected/3a01b39a-d3cb-4c13-ba47-ae57c73ab5c1-kube-api-access-jmmhw") pod "network-check-target-gphbk" (UID: "3a01b39a-d3cb-4c13-ba47-ae57c73ab5c1") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 21 14:55:45.620700 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:55:45.620620 2610 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-gphbk" Apr 21 14:55:45.620870 ip-10-0-129-133 kubenswrapper[2610]: E0421 14:55:45.620747 2610 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-gphbk" podUID="3a01b39a-d3cb-4c13-ba47-ae57c73ab5c1" Apr 21 14:55:46.621163 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:55:46.621121 2610 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-mtdkf" Apr 21 14:55:46.621587 ip-10-0-129-133 kubenswrapper[2610]: E0421 14:55:46.621245 2610 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-mtdkf" podUID="9b064625-50f7-4c6a-be44-9aed34a00b26" Apr 21 14:55:47.621273 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:55:47.621227 2610 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-gphbk" Apr 21 14:55:47.621697 ip-10-0-129-133 kubenswrapper[2610]: E0421 14:55:47.621367 2610 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-gphbk" podUID="3a01b39a-d3cb-4c13-ba47-ae57c73ab5c1" Apr 21 14:55:48.621075 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:55:48.621034 2610 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-mtdkf" Apr 21 14:55:48.621215 ip-10-0-129-133 kubenswrapper[2610]: E0421 14:55:48.621158 2610 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-mtdkf" podUID="9b064625-50f7-4c6a-be44-9aed34a00b26" Apr 21 14:55:49.620408 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:55:49.620373 2610 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-gphbk" Apr 21 14:55:49.620894 ip-10-0-129-133 kubenswrapper[2610]: E0421 14:55:49.620493 2610 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-gphbk" podUID="3a01b39a-d3cb-4c13-ba47-ae57c73ab5c1" Apr 21 14:55:50.620811 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:55:50.620779 2610 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-mtdkf" Apr 21 14:55:50.621176 ip-10-0-129-133 kubenswrapper[2610]: E0421 14:55:50.620894 2610 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-mtdkf" podUID="9b064625-50f7-4c6a-be44-9aed34a00b26" Apr 21 14:55:51.620944 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:55:51.620869 2610 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-gphbk" Apr 21 14:55:51.621365 ip-10-0-129-133 kubenswrapper[2610]: E0421 14:55:51.620974 2610 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-gphbk" podUID="3a01b39a-d3cb-4c13-ba47-ae57c73ab5c1" Apr 21 14:55:52.620965 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:55:52.620931 2610 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-mtdkf" Apr 21 14:55:52.621422 ip-10-0-129-133 kubenswrapper[2610]: E0421 14:55:52.621047 2610 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-mtdkf" podUID="9b064625-50f7-4c6a-be44-9aed34a00b26" Apr 21 14:55:53.219552 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:55:53.219514 2610 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/9b064625-50f7-4c6a-be44-9aed34a00b26-metrics-certs\") pod \"network-metrics-daemon-mtdkf\" (UID: \"9b064625-50f7-4c6a-be44-9aed34a00b26\") " pod="openshift-multus/network-metrics-daemon-mtdkf" Apr 21 14:55:53.219717 ip-10-0-129-133 kubenswrapper[2610]: E0421 14:55:53.219699 2610 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 21 14:55:53.219779 ip-10-0-129-133 kubenswrapper[2610]: E0421 14:55:53.219767 2610 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9b064625-50f7-4c6a-be44-9aed34a00b26-metrics-certs podName:9b064625-50f7-4c6a-be44-9aed34a00b26 nodeName:}" failed. No retries permitted until 2026-04-21 14:56:09.219746257 +0000 UTC m=+33.203638306 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/9b064625-50f7-4c6a-be44-9aed34a00b26-metrics-certs") pod "network-metrics-daemon-mtdkf" (UID: "9b064625-50f7-4c6a-be44-9aed34a00b26") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 21 14:55:53.421148 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:55:53.421104 2610 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jmmhw\" (UniqueName: \"kubernetes.io/projected/3a01b39a-d3cb-4c13-ba47-ae57c73ab5c1-kube-api-access-jmmhw\") pod \"network-check-target-gphbk\" (UID: \"3a01b39a-d3cb-4c13-ba47-ae57c73ab5c1\") " pod="openshift-network-diagnostics/network-check-target-gphbk" Apr 21 14:55:53.421366 ip-10-0-129-133 kubenswrapper[2610]: E0421 14:55:53.421273 2610 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 21 14:55:53.421366 ip-10-0-129-133 kubenswrapper[2610]: E0421 14:55:53.421290 2610 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 21 14:55:53.421366 ip-10-0-129-133 kubenswrapper[2610]: E0421 14:55:53.421302 2610 projected.go:194] Error preparing data for projected volume kube-api-access-jmmhw for pod openshift-network-diagnostics/network-check-target-gphbk: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 21 14:55:53.421366 ip-10-0-129-133 kubenswrapper[2610]: E0421 14:55:53.421364 2610 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3a01b39a-d3cb-4c13-ba47-ae57c73ab5c1-kube-api-access-jmmhw podName:3a01b39a-d3cb-4c13-ba47-ae57c73ab5c1 nodeName:}" failed. No retries permitted until 2026-04-21 14:56:09.421345116 +0000 UTC m=+33.405237146 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-jmmhw" (UniqueName: "kubernetes.io/projected/3a01b39a-d3cb-4c13-ba47-ae57c73ab5c1-kube-api-access-jmmhw") pod "network-check-target-gphbk" (UID: "3a01b39a-d3cb-4c13-ba47-ae57c73ab5c1") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 21 14:55:53.621277 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:55:53.621240 2610 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-gphbk" Apr 21 14:55:53.621764 ip-10-0-129-133 kubenswrapper[2610]: E0421 14:55:53.621375 2610 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-gphbk" podUID="3a01b39a-d3cb-4c13-ba47-ae57c73ab5c1" Apr 21 14:55:54.620719 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:55:54.620688 2610 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-mtdkf" Apr 21 14:55:54.620954 ip-10-0-129-133 kubenswrapper[2610]: E0421 14:55:54.620806 2610 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-mtdkf" podUID="9b064625-50f7-4c6a-be44-9aed34a00b26" Apr 21 14:55:54.637212 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:55:54.637181 2610 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/global-pull-secret-syncer-h5rsz"] Apr 21 14:55:54.658100 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:55:54.658074 2610 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-h5rsz" Apr 21 14:55:54.658278 ip-10-0-129-133 kubenswrapper[2610]: E0421 14:55:54.658160 2610 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-h5rsz" podUID="1649b770-32f3-4c98-9e33-13d820fcd898" Apr 21 14:55:54.730203 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:55:54.730171 2610 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/1649b770-32f3-4c98-9e33-13d820fcd898-dbus\") pod \"global-pull-secret-syncer-h5rsz\" (UID: \"1649b770-32f3-4c98-9e33-13d820fcd898\") " pod="kube-system/global-pull-secret-syncer-h5rsz" Apr 21 14:55:54.730343 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:55:54.730258 2610 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/1649b770-32f3-4c98-9e33-13d820fcd898-original-pull-secret\") pod \"global-pull-secret-syncer-h5rsz\" (UID: \"1649b770-32f3-4c98-9e33-13d820fcd898\") " pod="kube-system/global-pull-secret-syncer-h5rsz" Apr 21 14:55:54.730343 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:55:54.730285 2610 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/1649b770-32f3-4c98-9e33-13d820fcd898-kubelet-config\") pod \"global-pull-secret-syncer-h5rsz\" (UID: \"1649b770-32f3-4c98-9e33-13d820fcd898\") " pod="kube-system/global-pull-secret-syncer-h5rsz" Apr 21 14:55:54.831543 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:55:54.831504 2610 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/1649b770-32f3-4c98-9e33-13d820fcd898-original-pull-secret\") pod \"global-pull-secret-syncer-h5rsz\" (UID: \"1649b770-32f3-4c98-9e33-13d820fcd898\") " pod="kube-system/global-pull-secret-syncer-h5rsz" Apr 21 14:55:54.831744 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:55:54.831559 2610 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/1649b770-32f3-4c98-9e33-13d820fcd898-kubelet-config\") pod \"global-pull-secret-syncer-h5rsz\" (UID: \"1649b770-32f3-4c98-9e33-13d820fcd898\") " 
pod="kube-system/global-pull-secret-syncer-h5rsz" Apr 21 14:55:54.831744 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:55:54.831614 2610 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/1649b770-32f3-4c98-9e33-13d820fcd898-dbus\") pod \"global-pull-secret-syncer-h5rsz\" (UID: \"1649b770-32f3-4c98-9e33-13d820fcd898\") " pod="kube-system/global-pull-secret-syncer-h5rsz" Apr 21 14:55:54.831844 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:55:54.831815 2610 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/1649b770-32f3-4c98-9e33-13d820fcd898-dbus\") pod \"global-pull-secret-syncer-h5rsz\" (UID: \"1649b770-32f3-4c98-9e33-13d820fcd898\") " pod="kube-system/global-pull-secret-syncer-h5rsz" Apr 21 14:55:54.831917 ip-10-0-129-133 kubenswrapper[2610]: E0421 14:55:54.831899 2610 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 21 14:55:54.832043 ip-10-0-129-133 kubenswrapper[2610]: E0421 14:55:54.831961 2610 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1649b770-32f3-4c98-9e33-13d820fcd898-original-pull-secret podName:1649b770-32f3-4c98-9e33-13d820fcd898 nodeName:}" failed. No retries permitted until 2026-04-21 14:55:55.33194536 +0000 UTC m=+19.315837385 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/1649b770-32f3-4c98-9e33-13d820fcd898-original-pull-secret") pod "global-pull-secret-syncer-h5rsz" (UID: "1649b770-32f3-4c98-9e33-13d820fcd898") : object "kube-system"/"original-pull-secret" not registered Apr 21 14:55:54.832043 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:55:54.831977 2610 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/1649b770-32f3-4c98-9e33-13d820fcd898-kubelet-config\") pod \"global-pull-secret-syncer-h5rsz\" (UID: \"1649b770-32f3-4c98-9e33-13d820fcd898\") " pod="kube-system/global-pull-secret-syncer-h5rsz" Apr 21 14:55:55.335690 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:55:55.335653 2610 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/1649b770-32f3-4c98-9e33-13d820fcd898-original-pull-secret\") pod \"global-pull-secret-syncer-h5rsz\" (UID: \"1649b770-32f3-4c98-9e33-13d820fcd898\") " pod="kube-system/global-pull-secret-syncer-h5rsz" Apr 21 14:55:55.335845 ip-10-0-129-133 kubenswrapper[2610]: E0421 14:55:55.335813 2610 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 21 14:55:55.335904 ip-10-0-129-133 kubenswrapper[2610]: E0421 14:55:55.335881 2610 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1649b770-32f3-4c98-9e33-13d820fcd898-original-pull-secret podName:1649b770-32f3-4c98-9e33-13d820fcd898 nodeName:}" failed. No retries permitted until 2026-04-21 14:55:56.335866421 +0000 UTC m=+20.319758452 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/1649b770-32f3-4c98-9e33-13d820fcd898-original-pull-secret") pod "global-pull-secret-syncer-h5rsz" (UID: "1649b770-32f3-4c98-9e33-13d820fcd898") : object "kube-system"/"original-pull-secret" not registered Apr 21 14:55:55.620972 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:55:55.620881 2610 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-gphbk" Apr 21 14:55:55.621119 ip-10-0-129-133 kubenswrapper[2610]: E0421 14:55:55.621004 2610 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-gphbk" podUID="3a01b39a-d3cb-4c13-ba47-ae57c73ab5c1" Apr 21 14:55:56.343156 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:55:56.343119 2610 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/1649b770-32f3-4c98-9e33-13d820fcd898-original-pull-secret\") pod \"global-pull-secret-syncer-h5rsz\" (UID: \"1649b770-32f3-4c98-9e33-13d820fcd898\") " pod="kube-system/global-pull-secret-syncer-h5rsz" Apr 21 14:55:56.343502 ip-10-0-129-133 kubenswrapper[2610]: E0421 14:55:56.343262 2610 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 21 14:55:56.343502 ip-10-0-129-133 kubenswrapper[2610]: E0421 14:55:56.343338 2610 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1649b770-32f3-4c98-9e33-13d820fcd898-original-pull-secret podName:1649b770-32f3-4c98-9e33-13d820fcd898 nodeName:}" failed. No retries permitted until 2026-04-21 14:55:58.343323036 +0000 UTC m=+22.327215062 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/1649b770-32f3-4c98-9e33-13d820fcd898-original-pull-secret") pod "global-pull-secret-syncer-h5rsz" (UID: "1649b770-32f3-4c98-9e33-13d820fcd898") : object "kube-system"/"original-pull-secret" not registered Apr 21 14:55:56.621220 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:55:56.621188 2610 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-mtdkf" Apr 21 14:55:56.621361 ip-10-0-129-133 kubenswrapper[2610]: E0421 14:55:56.621288 2610 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-mtdkf" podUID="9b064625-50f7-4c6a-be44-9aed34a00b26" Apr 21 14:55:56.621404 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:55:56.621363 2610 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-h5rsz" Apr 21 14:55:56.621461 ip-10-0-129-133 kubenswrapper[2610]: E0421 14:55:56.621443 2610 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-h5rsz" podUID="1649b770-32f3-4c98-9e33-13d820fcd898" Apr 21 14:55:57.621109 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:55:57.620298 2610 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-gphbk" Apr 21 14:55:57.621109 ip-10-0-129-133 kubenswrapper[2610]: E0421 14:55:57.620694 2610 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-gphbk" podUID="3a01b39a-d3cb-4c13-ba47-ae57c73ab5c1" Apr 21 14:55:57.868812 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:55:57.868642 2610 generic.go:358] "Generic (PLEG): container finished" podID="bec3f852eac0a721026594b796d3fd1a" containerID="704eef9014e48513ed22086e53742892717d5530331cc038cc390b849526a319" exitCode=0 Apr 21 14:55:57.868973 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:55:57.868734 2610 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-133.ec2.internal" event={"ID":"bec3f852eac0a721026594b796d3fd1a","Type":"ContainerDied","Data":"704eef9014e48513ed22086e53742892717d5530331cc038cc390b849526a319"} Apr 21 14:55:57.870160 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:55:57.870132 2610 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-jr2bx" event={"ID":"bb8d83e4-e43f-4a6f-8423-a2feab619d63","Type":"ContainerStarted","Data":"dc9f78c29d9acb7219c54b4ad04205ed5fc6ce4fa0aa116cdd2dd05875294024"} Apr 21 14:55:57.872767 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:55:57.872691 2610 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-fqshv" event={"ID":"cff88e04-38fb-4737-b9d3-2f25f36cf06c","Type":"ContainerStarted","Data":"1787a283647161f317ebdcbdd646177edb9c01b06c33f29b61715da2c5b65a17"} Apr 21 14:55:57.872767 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:55:57.872713 2610 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-fqshv" event={"ID":"cff88e04-38fb-4737-b9d3-2f25f36cf06c","Type":"ContainerStarted","Data":"c79d3f7bd31d12433f938970d4f4d930aa63186d3bdb16da998dc3b1c84f5eef"} Apr 21 14:55:57.872767 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:55:57.872727 2610 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-fqshv" event={"ID":"cff88e04-38fb-4737-b9d3-2f25f36cf06c","Type":"ContainerStarted","Data":"bfcd42a3363861fe03c24027603ad8e89614a46ca17f6b4ae99412d50b3617e9"} Apr 21 14:55:57.872767 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:55:57.872736 2610 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-fqshv" 
event={"ID":"cff88e04-38fb-4737-b9d3-2f25f36cf06c","Type":"ContainerStarted","Data":"278298036ef9289172971a56db591777d017f160f6c4227e7bf49e5d1d5361d7"} Apr 21 14:55:57.872767 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:55:57.872747 2610 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-fqshv" event={"ID":"cff88e04-38fb-4737-b9d3-2f25f36cf06c","Type":"ContainerStarted","Data":"ea8df74247cc3d5b223950c7e92a31f26599ad18e53a76e26c8671765dbf209d"} Apr 21 14:55:57.872767 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:55:57.872759 2610 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-fqshv" event={"ID":"cff88e04-38fb-4737-b9d3-2f25f36cf06c","Type":"ContainerStarted","Data":"4ed3350743ce7d0270e0831236bed5943def45c3da557fb64d3d3de8b7d491ac"} Apr 21 14:55:57.873910 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:55:57.873870 2610 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-129-133.ec2.internal" event={"ID":"4074072fa96faab5923784feb5b91477","Type":"ContainerStarted","Data":"9b91c2f5b88b615287a312014ffcbc1ffdcd761db1ea09ecf8daef030fa2e038"} Apr 21 14:55:57.875066 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:55:57.875048 2610 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-z6hwp" event={"ID":"72237d81-3f9e-4b04-a299-0acb0dd6604c","Type":"ContainerStarted","Data":"76c1bd5b65897e290508d8d6614698840bc737c62f2385dbab14ea792ab758e8"} Apr 21 14:55:57.876237 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:55:57.876207 2610 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-whzlb" event={"ID":"1ecf557b-547c-4875-bb8d-a80ee4cd1f74","Type":"ContainerStarted","Data":"5cc86eb0d7e7d9d77333e8de05a454bfbcb4a07791bcf723ec66d82124345ddf"} Apr 21 14:55:57.877419 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:55:57.877398 2610 generic.go:358] "Generic (PLEG): container finished" podID="cf009889-4a60-4449-8425-a8c15708e69e" containerID="3161b6ec24b32616d64c0d1c939b1a6f938e5dc8e65e3776721a02c8f27d5c0d" exitCode=0 Apr 21 14:55:57.877477 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:55:57.877422 2610 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-kjdc5" event={"ID":"cf009889-4a60-4449-8425-a8c15708e69e","Type":"ContainerDied","Data":"3161b6ec24b32616d64c0d1c939b1a6f938e5dc8e65e3776721a02c8f27d5c0d"} Apr 21 14:55:57.878779 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:55:57.878763 2610 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-qgsvb" event={"ID":"002f040e-530f-43cc-92d7-0789dd3ec88e","Type":"ContainerStarted","Data":"b02735243838186683519f197d0e73a0dba29f1b20283c7ea0c4d5240d1c394f"} Apr 21 14:55:57.880105 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:55:57.880077 2610 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-n8qc9" event={"ID":"b48f3832-4ecd-46ba-bde8-35a4180bf3ca","Type":"ContainerStarted","Data":"9f487cf5dd1825b80f270cea3b1e71dcaa149fda9593a7a7bdcc0e0a815ea769"} Apr 21 14:55:57.881308 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:55:57.881291 2610 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-fzdmw" event={"ID":"efd7538a-c71f-4a2a-99a4-44675f8eab54","Type":"ContainerStarted","Data":"c2138f27f430e20456b90764e2b795b36c107ab336db42114d8a3f04ae47750a"} Apr 21 14:55:57.917767 ip-10-0-129-133 kubenswrapper[2610]: I0421 
14:55:57.917740 2610 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/konnectivity-agent-qgsvb" Apr 21 14:55:57.918332 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:55:57.918312 2610 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kube-system/konnectivity-agent-qgsvb" Apr 21 14:55:57.963861 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:55:57.963814 2610 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/konnectivity-agent-qgsvb" podStartSLOduration=3.176698336 podStartE2EDuration="21.963802817s" podCreationTimestamp="2026-04-21 14:55:36 +0000 UTC" firstStartedPulling="2026-04-21 14:55:37.872319669 +0000 UTC m=+1.856211701" lastFinishedPulling="2026-04-21 14:55:56.65942415 +0000 UTC m=+20.643316182" observedRunningTime="2026-04-21 14:55:57.963409652 +0000 UTC m=+21.947301699" watchObservedRunningTime="2026-04-21 14:55:57.963802817 +0000 UTC m=+21.947694865" Apr 21 14:55:57.994292 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:55:57.994248 2610 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-whzlb" podStartSLOduration=3.216047017 podStartE2EDuration="21.994230023s" podCreationTimestamp="2026-04-21 14:55:36 +0000 UTC" firstStartedPulling="2026-04-21 14:55:37.902758464 +0000 UTC m=+1.886650489" lastFinishedPulling="2026-04-21 14:55:56.68094146 +0000 UTC m=+20.664833495" observedRunningTime="2026-04-21 14:55:57.993969986 +0000 UTC m=+21.977862034" watchObservedRunningTime="2026-04-21 14:55:57.994230023 +0000 UTC m=+21.978122074" Apr 21 14:55:58.059676 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:55:58.058069 2610 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-n8qc9" podStartSLOduration=3.245812629 podStartE2EDuration="22.05805175s" podCreationTimestamp="2026-04-21 14:55:36 +0000 UTC" firstStartedPulling="2026-04-21 14:55:37.867727816 +0000 UTC m=+1.851619845" lastFinishedPulling="2026-04-21 14:55:56.67996694 +0000 UTC m=+20.663858966" observedRunningTime="2026-04-21 14:55:58.057463814 +0000 UTC m=+22.041355859" watchObservedRunningTime="2026-04-21 14:55:58.05805175 +0000 UTC m=+22.041943799" Apr 21 14:55:58.059676 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:55:58.058194 2610 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-z6hwp" podStartSLOduration=2.743401732 podStartE2EDuration="21.058185548s" podCreationTimestamp="2026-04-21 14:55:37 +0000 UTC" firstStartedPulling="2026-04-21 14:55:37.930528833 +0000 UTC m=+1.914420859" lastFinishedPulling="2026-04-21 14:55:56.245312636 +0000 UTC m=+20.229204675" observedRunningTime="2026-04-21 14:55:58.027476625 +0000 UTC m=+22.011368673" watchObservedRunningTime="2026-04-21 14:55:58.058185548 +0000 UTC m=+22.042077596" Apr 21 14:55:58.114749 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:55:58.114722 2610 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kube-system/konnectivity-agent-qgsvb" Apr 21 14:55:58.115133 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:55:58.115117 2610 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kube-system/konnectivity-agent-qgsvb" Apr 21 14:55:58.150032 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:55:58.149939 2610 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-proxy-ip-10-0-129-133.ec2.internal" podStartSLOduration=21.149926222 podStartE2EDuration="21.149926222s" 
podCreationTimestamp="2026-04-21 14:55:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-21 14:55:58.149243183 +0000 UTC m=+22.133135233" watchObservedRunningTime="2026-04-21 14:55:58.149926222 +0000 UTC m=+22.133818347" Apr 21 14:55:58.150142 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:55:58.150060 2610 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-node-tuning-operator/tuned-jr2bx" podStartSLOduration=2.641231889 podStartE2EDuration="21.150056205s" podCreationTimestamp="2026-04-21 14:55:37 +0000 UTC" firstStartedPulling="2026-04-21 14:55:37.926086057 +0000 UTC m=+1.909978083" lastFinishedPulling="2026-04-21 14:55:56.434910352 +0000 UTC m=+20.418802399" observedRunningTime="2026-04-21 14:55:58.130754064 +0000 UTC m=+22.114646112" watchObservedRunningTime="2026-04-21 14:55:58.150056205 +0000 UTC m=+22.133948254" Apr 21 14:55:58.363039 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:55:58.362987 2610 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/1649b770-32f3-4c98-9e33-13d820fcd898-original-pull-secret\") pod \"global-pull-secret-syncer-h5rsz\" (UID: \"1649b770-32f3-4c98-9e33-13d820fcd898\") " pod="kube-system/global-pull-secret-syncer-h5rsz" Apr 21 14:55:58.363213 ip-10-0-129-133 kubenswrapper[2610]: E0421 14:55:58.363110 2610 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 21 14:55:58.363213 ip-10-0-129-133 kubenswrapper[2610]: E0421 14:55:58.363194 2610 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1649b770-32f3-4c98-9e33-13d820fcd898-original-pull-secret podName:1649b770-32f3-4c98-9e33-13d820fcd898 nodeName:}" failed. No retries permitted until 2026-04-21 14:56:02.36317302 +0000 UTC m=+26.347065062 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/1649b770-32f3-4c98-9e33-13d820fcd898-original-pull-secret") pod "global-pull-secret-syncer-h5rsz" (UID: "1649b770-32f3-4c98-9e33-13d820fcd898") : object "kube-system"/"original-pull-secret" not registered Apr 21 14:55:58.620634 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:55:58.620482 2610 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-mtdkf" Apr 21 14:55:58.620741 ip-10-0-129-133 kubenswrapper[2610]: E0421 14:55:58.620659 2610 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-mtdkf" podUID="9b064625-50f7-4c6a-be44-9aed34a00b26" Apr 21 14:55:58.620741 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:55:58.620690 2610 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-h5rsz" Apr 21 14:55:58.620811 ip-10-0-129-133 kubenswrapper[2610]: E0421 14:55:58.620781 2610 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="kube-system/global-pull-secret-syncer-h5rsz" podUID="1649b770-32f3-4c98-9e33-13d820fcd898" Apr 21 14:55:58.756518 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:55:58.756496 2610 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock" Apr 21 14:55:58.885640 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:55:58.885543 2610 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-fzdmw" event={"ID":"efd7538a-c71f-4a2a-99a4-44675f8eab54","Type":"ContainerStarted","Data":"6f7ba5ab39639c1f732e3a75bd385500332217f49b85c097e016c6f4bc1f5237"} Apr 21 14:55:58.886981 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:55:58.886955 2610 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-xpgf5" event={"ID":"9c5fa724-5c79-4789-8467-fe3456892c7d","Type":"ContainerStarted","Data":"462c6c0256250487e1ff2b30ce908d2702a664553618a83ca8295cea692ae646"} Apr 21 14:55:58.888733 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:55:58.888707 2610 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-133.ec2.internal" event={"ID":"bec3f852eac0a721026594b796d3fd1a","Type":"ContainerStarted","Data":"42eab129b8be15c85c2035160f66c3dc4d1be150f3d2df9ee0b442a4a2c7c1d6"} Apr 21 14:55:58.901833 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:55:58.901793 2610 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-operator/iptables-alerter-xpgf5" podStartSLOduration=4.206986474 podStartE2EDuration="22.901781201s" podCreationTimestamp="2026-04-21 14:55:36 +0000 UTC" firstStartedPulling="2026-04-21 14:55:37.807447561 +0000 UTC m=+1.791339586" lastFinishedPulling="2026-04-21 14:55:56.502242275 +0000 UTC m=+20.486134313" observedRunningTime="2026-04-21 14:55:58.901555749 +0000 UTC m=+22.885447796" watchObservedRunningTime="2026-04-21 14:55:58.901781201 +0000 UTC m=+22.885673249" Apr 21 14:55:59.556352 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:55:59.556239 2610 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock","Timestamp":"2026-04-21T14:55:58.756512813Z","UUID":"c98b7eef-2b41-4531-9916-134b76ca5f84","Handler":null,"Name":"","Endpoint":""} Apr 21 14:55:59.558222 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:55:59.558197 2610 csi_plugin.go:106] kubernetes.io/csi: Trying to validate a new CSI Driver with name: ebs.csi.aws.com endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock versions: 1.0.0 Apr 21 14:55:59.558365 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:55:59.558231 2610 csi_plugin.go:119] kubernetes.io/csi: Register new plugin with name: ebs.csi.aws.com at endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock Apr 21 14:55:59.620779 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:55:59.620751 2610 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-gphbk" Apr 21 14:55:59.620963 ip-10-0-129-133 kubenswrapper[2610]: E0421 14:55:59.620865 2610 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-gphbk" podUID="3a01b39a-d3cb-4c13-ba47-ae57c73ab5c1" Apr 21 14:55:59.893622 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:55:59.893589 2610 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-fqshv" event={"ID":"cff88e04-38fb-4737-b9d3-2f25f36cf06c","Type":"ContainerStarted","Data":"9ca0c1d9360b2d113b85e0863e20cd36fcb64c8bf37e3c7c88843aa1c11525b8"} Apr 21 14:56:00.621135 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:56:00.620921 2610 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-h5rsz" Apr 21 14:56:00.621332 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:56:00.620922 2610 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-mtdkf" Apr 21 14:56:00.621332 ip-10-0-129-133 kubenswrapper[2610]: E0421 14:56:00.621227 2610 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-h5rsz" podUID="1649b770-32f3-4c98-9e33-13d820fcd898" Apr 21 14:56:00.621332 ip-10-0-129-133 kubenswrapper[2610]: E0421 14:56:00.621320 2610 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-mtdkf" podUID="9b064625-50f7-4c6a-be44-9aed34a00b26" Apr 21 14:56:00.897022 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:56:00.896933 2610 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-fzdmw" event={"ID":"efd7538a-c71f-4a2a-99a4-44675f8eab54","Type":"ContainerStarted","Data":"9c26265fc95609aa412128bbb4be7bf36fc960fc607d82d71b9f254e92d8a19d"} Apr 21 14:56:00.913850 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:56:00.913794 2610 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-133.ec2.internal" podStartSLOduration=23.913776939 podStartE2EDuration="23.913776939s" podCreationTimestamp="2026-04-21 14:55:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-21 14:55:58.914040254 +0000 UTC m=+22.897932304" watchObservedRunningTime="2026-04-21 14:56:00.913776939 +0000 UTC m=+24.897668989" Apr 21 14:56:00.913997 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:56:00.913913 2610 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-fzdmw" podStartSLOduration=3.005010058 podStartE2EDuration="24.913907306s" podCreationTimestamp="2026-04-21 14:55:36 +0000 UTC" firstStartedPulling="2026-04-21 14:55:37.845714652 +0000 UTC m=+1.829606677" lastFinishedPulling="2026-04-21 14:55:59.754611899 +0000 UTC m=+23.738503925" observedRunningTime="2026-04-21 14:56:00.913281985 +0000 UTC m=+24.897174032" watchObservedRunningTime="2026-04-21 14:56:00.913907306 +0000 UTC m=+24.897799356" Apr 21 14:56:01.620936 ip-10-0-129-133 
kubenswrapper[2610]: I0421 14:56:01.620901 2610 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-gphbk" Apr 21 14:56:01.621121 ip-10-0-129-133 kubenswrapper[2610]: E0421 14:56:01.621027 2610 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-gphbk" podUID="3a01b39a-d3cb-4c13-ba47-ae57c73ab5c1" Apr 21 14:56:02.399047 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:56:02.398797 2610 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/1649b770-32f3-4c98-9e33-13d820fcd898-original-pull-secret\") pod \"global-pull-secret-syncer-h5rsz\" (UID: \"1649b770-32f3-4c98-9e33-13d820fcd898\") " pod="kube-system/global-pull-secret-syncer-h5rsz" Apr 21 14:56:02.399047 ip-10-0-129-133 kubenswrapper[2610]: E0421 14:56:02.398944 2610 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 21 14:56:02.399047 ip-10-0-129-133 kubenswrapper[2610]: E0421 14:56:02.399046 2610 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1649b770-32f3-4c98-9e33-13d820fcd898-original-pull-secret podName:1649b770-32f3-4c98-9e33-13d820fcd898 nodeName:}" failed. No retries permitted until 2026-04-21 14:56:10.399032944 +0000 UTC m=+34.382924969 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/1649b770-32f3-4c98-9e33-13d820fcd898-original-pull-secret") pod "global-pull-secret-syncer-h5rsz" (UID: "1649b770-32f3-4c98-9e33-13d820fcd898") : object "kube-system"/"original-pull-secret" not registered Apr 21 14:56:02.621155 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:56:02.621119 2610 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-mtdkf" Apr 21 14:56:02.621309 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:56:02.621121 2610 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-h5rsz" Apr 21 14:56:02.621309 ip-10-0-129-133 kubenswrapper[2610]: E0421 14:56:02.621223 2610 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-mtdkf" podUID="9b064625-50f7-4c6a-be44-9aed34a00b26" Apr 21 14:56:02.621416 ip-10-0-129-133 kubenswrapper[2610]: E0421 14:56:02.621319 2610 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-h5rsz" podUID="1649b770-32f3-4c98-9e33-13d820fcd898" Apr 21 14:56:02.902005 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:56:02.901967 2610 generic.go:358] "Generic (PLEG): container finished" podID="cf009889-4a60-4449-8425-a8c15708e69e" containerID="5f767c05702494dc0823c27c67665d5adcbdfe7afd8b8311234f122fb17919cf" exitCode=0 Apr 21 14:56:02.902213 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:56:02.902058 2610 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-kjdc5" event={"ID":"cf009889-4a60-4449-8425-a8c15708e69e","Type":"ContainerDied","Data":"5f767c05702494dc0823c27c67665d5adcbdfe7afd8b8311234f122fb17919cf"} Apr 21 14:56:02.905266 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:56:02.905236 2610 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-fqshv" event={"ID":"cff88e04-38fb-4737-b9d3-2f25f36cf06c","Type":"ContainerStarted","Data":"fed0210c1e34f2a61acf500b46d22901d5672f9d3da5f5a85f2db74ce577e1f9"} Apr 21 14:56:02.905643 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:56:02.905549 2610 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-fqshv" Apr 21 14:56:02.905643 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:56:02.905597 2610 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-fqshv" Apr 21 14:56:02.905643 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:56:02.905611 2610 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-fqshv" Apr 21 14:56:02.920388 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:56:02.920369 2610 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-fqshv" Apr 21 14:56:02.920498 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:56:02.920426 2610 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-fqshv" Apr 21 14:56:02.948452 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:56:02.948410 2610 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-fqshv" podStartSLOduration=7.678643133 podStartE2EDuration="26.948398888s" podCreationTimestamp="2026-04-21 14:55:36 +0000 UTC" firstStartedPulling="2026-04-21 14:55:37.821246627 +0000 UTC m=+1.805138653" lastFinishedPulling="2026-04-21 14:55:57.091002382 +0000 UTC m=+21.074894408" observedRunningTime="2026-04-21 14:56:02.947563795 +0000 UTC m=+26.931455844" watchObservedRunningTime="2026-04-21 14:56:02.948398888 +0000 UTC m=+26.932290937" Apr 21 14:56:03.620982 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:56:03.620950 2610 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-gphbk" Apr 21 14:56:03.621502 ip-10-0-129-133 kubenswrapper[2610]: E0421 14:56:03.621069 2610 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-gphbk" podUID="3a01b39a-d3cb-4c13-ba47-ae57c73ab5c1" Apr 21 14:56:03.919549 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:56:03.919514 2610 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-h5rsz"] Apr 21 14:56:03.919687 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:56:03.919660 2610 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-h5rsz" Apr 21 14:56:03.920029 ip-10-0-129-133 kubenswrapper[2610]: E0421 14:56:03.919757 2610 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-h5rsz" podUID="1649b770-32f3-4c98-9e33-13d820fcd898" Apr 21 14:56:03.922068 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:56:03.922047 2610 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-gphbk"] Apr 21 14:56:03.922145 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:56:03.922130 2610 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-gphbk" Apr 21 14:56:03.922211 ip-10-0-129-133 kubenswrapper[2610]: E0421 14:56:03.922196 2610 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-gphbk" podUID="3a01b39a-d3cb-4c13-ba47-ae57c73ab5c1" Apr 21 14:56:03.924845 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:56:03.924826 2610 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-mtdkf"] Apr 21 14:56:03.924939 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:56:03.924923 2610 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-mtdkf" Apr 21 14:56:03.925055 ip-10-0-129-133 kubenswrapper[2610]: E0421 14:56:03.925019 2610 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-mtdkf" podUID="9b064625-50f7-4c6a-be44-9aed34a00b26" Apr 21 14:56:04.911625 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:56:04.911589 2610 generic.go:358] "Generic (PLEG): container finished" podID="cf009889-4a60-4449-8425-a8c15708e69e" containerID="181a151a8294b39b1715fafd85958835177ddf2abab4ea0b690df0a7babdad78" exitCode=0 Apr 21 14:56:04.912123 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:56:04.911668 2610 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-kjdc5" event={"ID":"cf009889-4a60-4449-8425-a8c15708e69e","Type":"ContainerDied","Data":"181a151a8294b39b1715fafd85958835177ddf2abab4ea0b690df0a7babdad78"} Apr 21 14:56:05.620670 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:56:05.620628 2610 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-gphbk" Apr 21 14:56:05.620811 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:56:05.620675 2610 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-h5rsz" Apr 21 14:56:05.620811 ip-10-0-129-133 kubenswrapper[2610]: E0421 14:56:05.620748 2610 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-gphbk" podUID="3a01b39a-d3cb-4c13-ba47-ae57c73ab5c1" Apr 21 14:56:05.620912 ip-10-0-129-133 kubenswrapper[2610]: E0421 14:56:05.620825 2610 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-h5rsz" podUID="1649b770-32f3-4c98-9e33-13d820fcd898" Apr 21 14:56:05.620912 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:56:05.620859 2610 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-mtdkf" Apr 21 14:56:05.620973 ip-10-0-129-133 kubenswrapper[2610]: E0421 14:56:05.620945 2610 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-mtdkf" podUID="9b064625-50f7-4c6a-be44-9aed34a00b26" Apr 21 14:56:05.915326 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:56:05.915299 2610 generic.go:358] "Generic (PLEG): container finished" podID="cf009889-4a60-4449-8425-a8c15708e69e" containerID="0431ef487546c7835b0d74459ce45235bd31b53e3bf67916e16c58b3874f3e6e" exitCode=0 Apr 21 14:56:05.915639 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:56:05.915354 2610 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-kjdc5" event={"ID":"cf009889-4a60-4449-8425-a8c15708e69e","Type":"ContainerDied","Data":"0431ef487546c7835b0d74459ce45235bd31b53e3bf67916e16c58b3874f3e6e"} Apr 21 14:56:07.620409 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:56:07.620365 2610 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-gphbk" Apr 21 14:56:07.621011 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:56:07.620365 2610 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-mtdkf" Apr 21 14:56:07.621011 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:56:07.620380 2610 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-h5rsz" Apr 21 14:56:07.621011 ip-10-0-129-133 kubenswrapper[2610]: E0421 14:56:07.620483 2610 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-gphbk" podUID="3a01b39a-d3cb-4c13-ba47-ae57c73ab5c1" Apr 21 14:56:07.621011 ip-10-0-129-133 kubenswrapper[2610]: E0421 14:56:07.620677 2610 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-mtdkf" podUID="9b064625-50f7-4c6a-be44-9aed34a00b26" Apr 21 14:56:07.621011 ip-10-0-129-133 kubenswrapper[2610]: E0421 14:56:07.620730 2610 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-h5rsz" podUID="1649b770-32f3-4c98-9e33-13d820fcd898" Apr 21 14:56:09.254117 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:56:09.254084 2610 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/9b064625-50f7-4c6a-be44-9aed34a00b26-metrics-certs\") pod \"network-metrics-daemon-mtdkf\" (UID: \"9b064625-50f7-4c6a-be44-9aed34a00b26\") " pod="openshift-multus/network-metrics-daemon-mtdkf" Apr 21 14:56:09.254704 ip-10-0-129-133 kubenswrapper[2610]: E0421 14:56:09.254238 2610 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 21 14:56:09.254704 ip-10-0-129-133 kubenswrapper[2610]: E0421 14:56:09.254311 2610 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9b064625-50f7-4c6a-be44-9aed34a00b26-metrics-certs podName:9b064625-50f7-4c6a-be44-9aed34a00b26 nodeName:}" failed. No retries permitted until 2026-04-21 14:56:41.25429008 +0000 UTC m=+65.238182109 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/9b064625-50f7-4c6a-be44-9aed34a00b26-metrics-certs") pod "network-metrics-daemon-mtdkf" (UID: "9b064625-50f7-4c6a-be44-9aed34a00b26") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 21 14:56:09.456261 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:56:09.456213 2610 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jmmhw\" (UniqueName: \"kubernetes.io/projected/3a01b39a-d3cb-4c13-ba47-ae57c73ab5c1-kube-api-access-jmmhw\") pod \"network-check-target-gphbk\" (UID: \"3a01b39a-d3cb-4c13-ba47-ae57c73ab5c1\") " pod="openshift-network-diagnostics/network-check-target-gphbk" Apr 21 14:56:09.456441 ip-10-0-129-133 kubenswrapper[2610]: E0421 14:56:09.456370 2610 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 21 14:56:09.456441 ip-10-0-129-133 kubenswrapper[2610]: E0421 14:56:09.456390 2610 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 21 14:56:09.456441 ip-10-0-129-133 kubenswrapper[2610]: E0421 14:56:09.456402 2610 projected.go:194] Error preparing data for projected volume kube-api-access-jmmhw for pod openshift-network-diagnostics/network-check-target-gphbk: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 21 14:56:09.456609 ip-10-0-129-133 kubenswrapper[2610]: E0421 14:56:09.456469 2610 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3a01b39a-d3cb-4c13-ba47-ae57c73ab5c1-kube-api-access-jmmhw podName:3a01b39a-d3cb-4c13-ba47-ae57c73ab5c1 nodeName:}" failed. No retries permitted until 2026-04-21 14:56:41.456450149 +0000 UTC m=+65.440342181 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "kube-api-access-jmmhw" (UniqueName: "kubernetes.io/projected/3a01b39a-d3cb-4c13-ba47-ae57c73ab5c1-kube-api-access-jmmhw") pod "network-check-target-gphbk" (UID: "3a01b39a-d3cb-4c13-ba47-ae57c73ab5c1") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 21 14:56:09.620763 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:56:09.620727 2610 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-gphbk" Apr 21 14:56:09.620954 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:56:09.620898 2610 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-h5rsz" Apr 21 14:56:09.620954 ip-10-0-129-133 kubenswrapper[2610]: E0421 14:56:09.620914 2610 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-gphbk" podUID="3a01b39a-d3cb-4c13-ba47-ae57c73ab5c1" Apr 21 14:56:09.621073 ip-10-0-129-133 kubenswrapper[2610]: E0421 14:56:09.620996 2610 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-h5rsz" podUID="1649b770-32f3-4c98-9e33-13d820fcd898" Apr 21 14:56:09.621073 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:56:09.621048 2610 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-mtdkf" Apr 21 14:56:09.621174 ip-10-0-129-133 kubenswrapper[2610]: E0421 14:56:09.621113 2610 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-mtdkf" podUID="9b064625-50f7-4c6a-be44-9aed34a00b26" Apr 21 14:56:09.793980 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:56:09.793899 2610 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-133.ec2.internal" event="NodeReady" Apr 21 14:56:09.794130 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:56:09.794064 2610 kubelet_node_status.go:550] "Fast updating node status as it just became ready" Apr 21 14:56:09.826001 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:56:09.825957 2610 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-network-console/networking-console-plugin-cb95c66f6-c5gp2"] Apr 21 14:56:09.857052 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:56:09.857026 2610 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-67b4567bdb-m7dt5"] Apr 21 14:56:09.857221 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:56:09.857186 2610 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-cb95c66f6-c5gp2" Apr 21 14:56:09.859105 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:56:09.859075 2610 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-console\"/\"networking-console-plugin-cert\"" Apr 21 14:56:09.859233 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:56:09.859157 2610 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-console\"/\"default-dockercfg-jbj88\"" Apr 21 14:56:09.859281 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:56:09.859076 2610 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-console\"/\"networking-console-plugin\"" Apr 21 14:56:09.884742 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:56:09.884714 2610 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-console/networking-console-plugin-cb95c66f6-c5gp2"] Apr 21 14:56:09.884871 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:56:09.884756 2610 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-67b4567bdb-m7dt5"] Apr 21 14:56:09.884871 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:56:09.884761 2610 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-67b4567bdb-m7dt5" Apr 21 14:56:09.884871 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:56:09.884768 2610 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-6vmnw"] Apr 21 14:56:09.886686 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:56:09.886657 2610 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-47f52\"" Apr 21 14:56:09.886798 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:56:09.886694 2610 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-tls\"" Apr 21 14:56:09.886798 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:56:09.886698 2610 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"installation-pull-secrets\"" Apr 21 14:56:09.886798 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:56:09.886658 2610 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-private-configuration\"" Apr 21 14:56:09.899212 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:56:09.899192 2610 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-6vmnw"] Apr 21 14:56:09.899346 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:56:09.899329 2610 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-6vmnw" Apr 21 14:56:09.901834 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:56:09.901806 2610 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"openshift-service-ca.crt\"" Apr 21 14:56:09.901963 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:56:09.901848 2610 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"canary-serving-cert\"" Apr 21 14:56:09.901963 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:56:09.901799 2610 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"kube-root-ca.crt\"" Apr 21 14:56:09.902635 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:56:09.902614 2610 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-7xfsn\"" Apr 21 14:56:09.903506 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:56:09.903487 2610 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"trusted-ca\"" Apr 21 14:56:09.947809 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:56:09.947777 2610 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-mhp6p"] Apr 21 14:56:09.959691 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:56:09.959658 2610 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/8033b471-ca39-425f-9cbb-cf56b370a5a2-nginx-conf\") pod \"networking-console-plugin-cb95c66f6-c5gp2\" (UID: \"8033b471-ca39-425f-9cbb-cf56b370a5a2\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-c5gp2" Apr 21 14:56:09.959860 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:56:09.959751 2610 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"networking-console-plugin-cert\" (UniqueName: 
\"kubernetes.io/secret/8033b471-ca39-425f-9cbb-cf56b370a5a2-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-c5gp2\" (UID: \"8033b471-ca39-425f-9cbb-cf56b370a5a2\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-c5gp2" Apr 21 14:56:09.966915 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:56:09.966886 2610 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-mhp6p"] Apr 21 14:56:09.967045 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:56:09.967029 2610 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-mhp6p" Apr 21 14:56:09.968874 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:56:09.968854 2610 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-5sqkw\"" Apr 21 14:56:09.969007 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:56:09.968885 2610 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"dns-default\"" Apr 21 14:56:09.969007 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:56:09.968963 2610 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-default-metrics-tls\"" Apr 21 14:56:10.060694 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:56:10.060610 2610 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/5ac89af4-5925-4a52-a694-31a92b841ed6-trusted-ca\") pod \"image-registry-67b4567bdb-m7dt5\" (UID: \"5ac89af4-5925-4a52-a694-31a92b841ed6\") " pod="openshift-image-registry/image-registry-67b4567bdb-m7dt5" Apr 21 14:56:10.060694 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:56:10.060657 2610 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b4f6b\" (UniqueName: \"kubernetes.io/projected/17ba6101-b1f6-412d-b361-2276f610226b-kube-api-access-b4f6b\") pod \"dns-default-mhp6p\" (UID: \"17ba6101-b1f6-412d-b361-2276f610226b\") " pod="openshift-dns/dns-default-mhp6p" Apr 21 14:56:10.060914 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:56:10.060701 2610 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/8033b471-ca39-425f-9cbb-cf56b370a5a2-nginx-conf\") pod \"networking-console-plugin-cb95c66f6-c5gp2\" (UID: \"8033b471-ca39-425f-9cbb-cf56b370a5a2\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-c5gp2" Apr 21 14:56:10.060914 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:56:10.060757 2610 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/8033b471-ca39-425f-9cbb-cf56b370a5a2-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-c5gp2\" (UID: \"8033b471-ca39-425f-9cbb-cf56b370a5a2\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-c5gp2" Apr 21 14:56:10.060914 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:56:10.060785 2610 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/5ac89af4-5925-4a52-a694-31a92b841ed6-registry-tls\") pod \"image-registry-67b4567bdb-m7dt5\" (UID: \"5ac89af4-5925-4a52-a694-31a92b841ed6\") " pod="openshift-image-registry/image-registry-67b4567bdb-m7dt5" Apr 21 14:56:10.060914 ip-10-0-129-133 kubenswrapper[2610]: I0421 
Apr 21 14:56:10.060914 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:56:10.060805 2610 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v4hvr\" (UniqueName: \"kubernetes.io/projected/5ac89af4-5925-4a52-a694-31a92b841ed6-kube-api-access-v4hvr\") pod \"image-registry-67b4567bdb-m7dt5\" (UID: \"5ac89af4-5925-4a52-a694-31a92b841ed6\") " pod="openshift-image-registry/image-registry-67b4567bdb-m7dt5"
Apr 21 14:56:10.060914 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:56:10.060828 2610 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/17ba6101-b1f6-412d-b361-2276f610226b-config-volume\") pod \"dns-default-mhp6p\" (UID: \"17ba6101-b1f6-412d-b361-2276f610226b\") " pod="openshift-dns/dns-default-mhp6p"
Apr 21 14:56:10.060914 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:56:10.060900 2610 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/5ac89af4-5925-4a52-a694-31a92b841ed6-image-registry-private-configuration\") pod \"image-registry-67b4567bdb-m7dt5\" (UID: \"5ac89af4-5925-4a52-a694-31a92b841ed6\") " pod="openshift-image-registry/image-registry-67b4567bdb-m7dt5"
Apr 21 14:56:10.060914 ip-10-0-129-133 kubenswrapper[2610]: E0421 14:56:10.060911 2610 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found
Apr 21 14:56:10.061248 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:56:10.060926 2610 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/5ac89af4-5925-4a52-a694-31a92b841ed6-ca-trust-extracted\") pod \"image-registry-67b4567bdb-m7dt5\" (UID: \"5ac89af4-5925-4a52-a694-31a92b841ed6\") " pod="openshift-image-registry/image-registry-67b4567bdb-m7dt5"
Apr 21 14:56:10.061248 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:56:10.060948 2610 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/17ba6101-b1f6-412d-b361-2276f610226b-metrics-tls\") pod \"dns-default-mhp6p\" (UID: \"17ba6101-b1f6-412d-b361-2276f610226b\") " pod="openshift-dns/dns-default-mhp6p"
Apr 21 14:56:10.061248 ip-10-0-129-133 kubenswrapper[2610]: E0421 14:56:10.060987 2610 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8033b471-ca39-425f-9cbb-cf56b370a5a2-networking-console-plugin-cert podName:8033b471-ca39-425f-9cbb-cf56b370a5a2 nodeName:}" failed. No retries permitted until 2026-04-21 14:56:10.560967834 +0000 UTC m=+34.544859878 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/8033b471-ca39-425f-9cbb-cf56b370a5a2-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-c5gp2" (UID: "8033b471-ca39-425f-9cbb-cf56b370a5a2") : secret "networking-console-plugin-cert" not found
Apr 21 14:56:10.061248 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:56:10.061058 2610 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/5ac89af4-5925-4a52-a694-31a92b841ed6-installation-pull-secrets\") pod \"image-registry-67b4567bdb-m7dt5\" (UID: \"5ac89af4-5925-4a52-a694-31a92b841ed6\") " pod="openshift-image-registry/image-registry-67b4567bdb-m7dt5"
Apr 21 14:56:10.061248 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:56:10.061084 2610 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/5ac89af4-5925-4a52-a694-31a92b841ed6-registry-certificates\") pod \"image-registry-67b4567bdb-m7dt5\" (UID: \"5ac89af4-5925-4a52-a694-31a92b841ed6\") " pod="openshift-image-registry/image-registry-67b4567bdb-m7dt5"
Apr 21 14:56:10.061248 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:56:10.061107 2610 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/5ac89af4-5925-4a52-a694-31a92b841ed6-bound-sa-token\") pod \"image-registry-67b4567bdb-m7dt5\" (UID: \"5ac89af4-5925-4a52-a694-31a92b841ed6\") " pod="openshift-image-registry/image-registry-67b4567bdb-m7dt5"
Apr 21 14:56:10.061248 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:56:10.061127 2610 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/40ebea87-6126-42fa-bcf7-027f7fbce419-cert\") pod \"ingress-canary-6vmnw\" (UID: \"40ebea87-6126-42fa-bcf7-027f7fbce419\") " pod="openshift-ingress-canary/ingress-canary-6vmnw"
Apr 21 14:56:10.061248 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:56:10.061149 2610 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-snsn6\" (UniqueName: \"kubernetes.io/projected/40ebea87-6126-42fa-bcf7-027f7fbce419-kube-api-access-snsn6\") pod \"ingress-canary-6vmnw\" (UID: \"40ebea87-6126-42fa-bcf7-027f7fbce419\") " pod="openshift-ingress-canary/ingress-canary-6vmnw"
Apr 21 14:56:10.061248 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:56:10.061170 2610 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/17ba6101-b1f6-412d-b361-2276f610226b-tmp-dir\") pod \"dns-default-mhp6p\" (UID: \"17ba6101-b1f6-412d-b361-2276f610226b\") " pod="openshift-dns/dns-default-mhp6p"
Apr 21 14:56:10.061719 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:56:10.061531 2610 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/8033b471-ca39-425f-9cbb-cf56b370a5a2-nginx-conf\") pod \"networking-console-plugin-cb95c66f6-c5gp2\" (UID: \"8033b471-ca39-425f-9cbb-cf56b370a5a2\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-c5gp2"
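
Every "MountVolume.SetUp failed ... not found" above means the kubelet asked the API server for a Secret that does not exist yet; the pod's other volumes mount normally and the pod simply waits in ContainerCreating until the missing Secret (here networking-console-plugin-cert, and further down image-registry-tls, dns-default-metrics-tls and canary-serving-cert) is created, typically by the operator that issues the serving certificate. A minimal client-go sketch of the same lookup the kubelet keeps failing on (namespace and secret name taken from the log; the kubeconfig path is an assumption for running this outside the cluster):

package main

import (
	"context"
	"fmt"

	apierrors "k8s.io/apimachinery/pkg/api/errors"
	metav1 "k8s.io/apimachinery/pkg/apis/meta/v1"
	"k8s.io/client-go/kubernetes"
	"k8s.io/client-go/tools/clientcmd"
)

func main() {
	// Assumes a local kubeconfig; inside a pod you would use rest.InClusterConfig().
	cfg, err := clientcmd.BuildConfigFromFlags("", clientcmd.RecommendedHomeFile)
	if err != nil {
		panic(err)
	}
	cs, err := kubernetes.NewForConfig(cfg)
	if err != nil {
		panic(err)
	}
	// The same Get that fails in secret.go:189 above.
	_, err = cs.CoreV1().Secrets("openshift-network-console").
		Get(context.TODO(), "networking-console-plugin-cert", metav1.GetOptions{})
	switch {
	case apierrors.IsNotFound(err):
		fmt.Println("secret not created yet; the kubelet mount will keep retrying")
	case err != nil:
		panic(err)
	default:
		fmt.Println("secret exists; the next mount retry should succeed")
	}
}
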
(UniqueName: \"kubernetes.io/configmap/5ac89af4-5925-4a52-a694-31a92b841ed6-trusted-ca\") pod \"image-registry-67b4567bdb-m7dt5\" (UID: \"5ac89af4-5925-4a52-a694-31a92b841ed6\") " pod="openshift-image-registry/image-registry-67b4567bdb-m7dt5" Apr 21 14:56:10.162519 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:56:10.162362 2610 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-b4f6b\" (UniqueName: \"kubernetes.io/projected/17ba6101-b1f6-412d-b361-2276f610226b-kube-api-access-b4f6b\") pod \"dns-default-mhp6p\" (UID: \"17ba6101-b1f6-412d-b361-2276f610226b\") " pod="openshift-dns/dns-default-mhp6p" Apr 21 14:56:10.162519 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:56:10.162438 2610 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/5ac89af4-5925-4a52-a694-31a92b841ed6-registry-tls\") pod \"image-registry-67b4567bdb-m7dt5\" (UID: \"5ac89af4-5925-4a52-a694-31a92b841ed6\") " pod="openshift-image-registry/image-registry-67b4567bdb-m7dt5" Apr 21 14:56:10.162519 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:56:10.162466 2610 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-v4hvr\" (UniqueName: \"kubernetes.io/projected/5ac89af4-5925-4a52-a694-31a92b841ed6-kube-api-access-v4hvr\") pod \"image-registry-67b4567bdb-m7dt5\" (UID: \"5ac89af4-5925-4a52-a694-31a92b841ed6\") " pod="openshift-image-registry/image-registry-67b4567bdb-m7dt5" Apr 21 14:56:10.162519 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:56:10.162492 2610 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/17ba6101-b1f6-412d-b361-2276f610226b-config-volume\") pod \"dns-default-mhp6p\" (UID: \"17ba6101-b1f6-412d-b361-2276f610226b\") " pod="openshift-dns/dns-default-mhp6p" Apr 21 14:56:10.162777 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:56:10.162519 2610 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/5ac89af4-5925-4a52-a694-31a92b841ed6-image-registry-private-configuration\") pod \"image-registry-67b4567bdb-m7dt5\" (UID: \"5ac89af4-5925-4a52-a694-31a92b841ed6\") " pod="openshift-image-registry/image-registry-67b4567bdb-m7dt5" Apr 21 14:56:10.162777 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:56:10.162545 2610 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/5ac89af4-5925-4a52-a694-31a92b841ed6-ca-trust-extracted\") pod \"image-registry-67b4567bdb-m7dt5\" (UID: \"5ac89af4-5925-4a52-a694-31a92b841ed6\") " pod="openshift-image-registry/image-registry-67b4567bdb-m7dt5" Apr 21 14:56:10.162777 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:56:10.162565 2610 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/17ba6101-b1f6-412d-b361-2276f610226b-metrics-tls\") pod \"dns-default-mhp6p\" (UID: \"17ba6101-b1f6-412d-b361-2276f610226b\") " pod="openshift-dns/dns-default-mhp6p" Apr 21 14:56:10.162777 ip-10-0-129-133 kubenswrapper[2610]: E0421 14:56:10.162565 2610 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 21 14:56:10.162777 ip-10-0-129-133 kubenswrapper[2610]: E0421 14:56:10.162616 2610 projected.go:194] Error preparing data for projected volume 
registry-tls for pod openshift-image-registry/image-registry-67b4567bdb-m7dt5: secret "image-registry-tls" not found Apr 21 14:56:10.162777 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:56:10.162638 2610 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/5ac89af4-5925-4a52-a694-31a92b841ed6-installation-pull-secrets\") pod \"image-registry-67b4567bdb-m7dt5\" (UID: \"5ac89af4-5925-4a52-a694-31a92b841ed6\") " pod="openshift-image-registry/image-registry-67b4567bdb-m7dt5" Apr 21 14:56:10.162777 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:56:10.162656 2610 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/5ac89af4-5925-4a52-a694-31a92b841ed6-registry-certificates\") pod \"image-registry-67b4567bdb-m7dt5\" (UID: \"5ac89af4-5925-4a52-a694-31a92b841ed6\") " pod="openshift-image-registry/image-registry-67b4567bdb-m7dt5" Apr 21 14:56:10.162777 ip-10-0-129-133 kubenswrapper[2610]: E0421 14:56:10.162678 2610 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/5ac89af4-5925-4a52-a694-31a92b841ed6-registry-tls podName:5ac89af4-5925-4a52-a694-31a92b841ed6 nodeName:}" failed. No retries permitted until 2026-04-21 14:56:10.662657201 +0000 UTC m=+34.646549249 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/5ac89af4-5925-4a52-a694-31a92b841ed6-registry-tls") pod "image-registry-67b4567bdb-m7dt5" (UID: "5ac89af4-5925-4a52-a694-31a92b841ed6") : secret "image-registry-tls" not found Apr 21 14:56:10.162777 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:56:10.162715 2610 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/5ac89af4-5925-4a52-a694-31a92b841ed6-bound-sa-token\") pod \"image-registry-67b4567bdb-m7dt5\" (UID: \"5ac89af4-5925-4a52-a694-31a92b841ed6\") " pod="openshift-image-registry/image-registry-67b4567bdb-m7dt5" Apr 21 14:56:10.162777 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:56:10.162745 2610 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/40ebea87-6126-42fa-bcf7-027f7fbce419-cert\") pod \"ingress-canary-6vmnw\" (UID: \"40ebea87-6126-42fa-bcf7-027f7fbce419\") " pod="openshift-ingress-canary/ingress-canary-6vmnw" Apr 21 14:56:10.162777 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:56:10.162772 2610 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-snsn6\" (UniqueName: \"kubernetes.io/projected/40ebea87-6126-42fa-bcf7-027f7fbce419-kube-api-access-snsn6\") pod \"ingress-canary-6vmnw\" (UID: \"40ebea87-6126-42fa-bcf7-027f7fbce419\") " pod="openshift-ingress-canary/ingress-canary-6vmnw" Apr 21 14:56:10.163288 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:56:10.162796 2610 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/17ba6101-b1f6-412d-b361-2276f610226b-tmp-dir\") pod \"dns-default-mhp6p\" (UID: \"17ba6101-b1f6-412d-b361-2276f610226b\") " pod="openshift-dns/dns-default-mhp6p" Apr 21 14:56:10.163288 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:56:10.163121 2610 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/17ba6101-b1f6-412d-b361-2276f610226b-tmp-dir\") pod 
\"dns-default-mhp6p\" (UID: \"17ba6101-b1f6-412d-b361-2276f610226b\") " pod="openshift-dns/dns-default-mhp6p" Apr 21 14:56:10.163288 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:56:10.163146 2610 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/5ac89af4-5925-4a52-a694-31a92b841ed6-registry-certificates\") pod \"image-registry-67b4567bdb-m7dt5\" (UID: \"5ac89af4-5925-4a52-a694-31a92b841ed6\") " pod="openshift-image-registry/image-registry-67b4567bdb-m7dt5" Apr 21 14:56:10.163426 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:56:10.163295 2610 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/5ac89af4-5925-4a52-a694-31a92b841ed6-trusted-ca\") pod \"image-registry-67b4567bdb-m7dt5\" (UID: \"5ac89af4-5925-4a52-a694-31a92b841ed6\") " pod="openshift-image-registry/image-registry-67b4567bdb-m7dt5" Apr 21 14:56:10.163426 ip-10-0-129-133 kubenswrapper[2610]: E0421 14:56:10.163386 2610 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 21 14:56:10.163527 ip-10-0-129-133 kubenswrapper[2610]: E0421 14:56:10.163429 2610 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/17ba6101-b1f6-412d-b361-2276f610226b-metrics-tls podName:17ba6101-b1f6-412d-b361-2276f610226b nodeName:}" failed. No retries permitted until 2026-04-21 14:56:10.663415078 +0000 UTC m=+34.647307108 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/17ba6101-b1f6-412d-b361-2276f610226b-metrics-tls") pod "dns-default-mhp6p" (UID: "17ba6101-b1f6-412d-b361-2276f610226b") : secret "dns-default-metrics-tls" not found Apr 21 14:56:10.163527 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:56:10.163435 2610 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/5ac89af4-5925-4a52-a694-31a92b841ed6-ca-trust-extracted\") pod \"image-registry-67b4567bdb-m7dt5\" (UID: \"5ac89af4-5925-4a52-a694-31a92b841ed6\") " pod="openshift-image-registry/image-registry-67b4567bdb-m7dt5" Apr 21 14:56:10.163663 ip-10-0-129-133 kubenswrapper[2610]: E0421 14:56:10.163603 2610 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 21 14:56:10.163663 ip-10-0-129-133 kubenswrapper[2610]: E0421 14:56:10.163649 2610 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/40ebea87-6126-42fa-bcf7-027f7fbce419-cert podName:40ebea87-6126-42fa-bcf7-027f7fbce419 nodeName:}" failed. No retries permitted until 2026-04-21 14:56:10.663633961 +0000 UTC m=+34.647526011 (durationBeforeRetry 500ms). 
Apr 21 14:56:10.163663 ip-10-0-129-133 kubenswrapper[2610]: E0421 14:56:10.163649 2610 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/40ebea87-6126-42fa-bcf7-027f7fbce419-cert podName:40ebea87-6126-42fa-bcf7-027f7fbce419 nodeName:}" failed. No retries permitted until 2026-04-21 14:56:10.663633961 +0000 UTC m=+34.647526011 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/40ebea87-6126-42fa-bcf7-027f7fbce419-cert") pod "ingress-canary-6vmnw" (UID: "40ebea87-6126-42fa-bcf7-027f7fbce419") : secret "canary-serving-cert" not found
Apr 21 14:56:10.164088 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:56:10.164068 2610 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/17ba6101-b1f6-412d-b361-2276f610226b-config-volume\") pod \"dns-default-mhp6p\" (UID: \"17ba6101-b1f6-412d-b361-2276f610226b\") " pod="openshift-dns/dns-default-mhp6p"
Apr 21 14:56:10.167877 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:56:10.167858 2610 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/5ac89af4-5925-4a52-a694-31a92b841ed6-installation-pull-secrets\") pod \"image-registry-67b4567bdb-m7dt5\" (UID: \"5ac89af4-5925-4a52-a694-31a92b841ed6\") " pod="openshift-image-registry/image-registry-67b4567bdb-m7dt5"
Apr 21 14:56:10.167967 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:56:10.167888 2610 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/5ac89af4-5925-4a52-a694-31a92b841ed6-image-registry-private-configuration\") pod \"image-registry-67b4567bdb-m7dt5\" (UID: \"5ac89af4-5925-4a52-a694-31a92b841ed6\") " pod="openshift-image-registry/image-registry-67b4567bdb-m7dt5"
Apr 21 14:56:10.172591 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:56:10.172540 2610 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-v4hvr\" (UniqueName: \"kubernetes.io/projected/5ac89af4-5925-4a52-a694-31a92b841ed6-kube-api-access-v4hvr\") pod \"image-registry-67b4567bdb-m7dt5\" (UID: \"5ac89af4-5925-4a52-a694-31a92b841ed6\") " pod="openshift-image-registry/image-registry-67b4567bdb-m7dt5"
Apr 21 14:56:10.172829 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:56:10.172809 2610 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-snsn6\" (UniqueName: \"kubernetes.io/projected/40ebea87-6126-42fa-bcf7-027f7fbce419-kube-api-access-snsn6\") pod \"ingress-canary-6vmnw\" (UID: \"40ebea87-6126-42fa-bcf7-027f7fbce419\") " pod="openshift-ingress-canary/ingress-canary-6vmnw"
Apr 21 14:56:10.172879 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:56:10.172830 2610 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/5ac89af4-5925-4a52-a694-31a92b841ed6-bound-sa-token\") pod \"image-registry-67b4567bdb-m7dt5\" (UID: \"5ac89af4-5925-4a52-a694-31a92b841ed6\") " pod="openshift-image-registry/image-registry-67b4567bdb-m7dt5"
Apr 21 14:56:10.184625 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:56:10.184554 2610 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-b4f6b\" (UniqueName: \"kubernetes.io/projected/17ba6101-b1f6-412d-b361-2276f610226b-kube-api-access-b4f6b\") pod \"dns-default-mhp6p\" (UID: \"17ba6101-b1f6-412d-b361-2276f610226b\") " pod="openshift-dns/dns-default-mhp6p"
Apr 21 14:56:10.465706 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:56:10.465615 2610 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/1649b770-32f3-4c98-9e33-13d820fcd898-original-pull-secret\") pod \"global-pull-secret-syncer-h5rsz\" (UID: \"1649b770-32f3-4c98-9e33-13d820fcd898\") " pod="kube-system/global-pull-secret-syncer-h5rsz"
Apr 21 14:56:10.466399 ip-10-0-129-133 kubenswrapper[2610]: E0421 14:56:10.465767 2610 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 21 14:56:10.466399 ip-10-0-129-133 kubenswrapper[2610]: E0421 14:56:10.465855 2610 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1649b770-32f3-4c98-9e33-13d820fcd898-original-pull-secret podName:1649b770-32f3-4c98-9e33-13d820fcd898 nodeName:}" failed. No retries permitted until 2026-04-21 14:56:26.465838146 +0000 UTC m=+50.449730177 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/1649b770-32f3-4c98-9e33-13d820fcd898-original-pull-secret") pod "global-pull-secret-syncer-h5rsz" (UID: "1649b770-32f3-4c98-9e33-13d820fcd898") : object "kube-system"/"original-pull-secret" not registered
Apr 21 14:56:10.567090 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:56:10.567047 2610 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/8033b471-ca39-425f-9cbb-cf56b370a5a2-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-c5gp2\" (UID: \"8033b471-ca39-425f-9cbb-cf56b370a5a2\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-c5gp2"
Apr 21 14:56:10.567270 ip-10-0-129-133 kubenswrapper[2610]: E0421 14:56:10.567233 2610 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found
Apr 21 14:56:10.567338 ip-10-0-129-133 kubenswrapper[2610]: E0421 14:56:10.567326 2610 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8033b471-ca39-425f-9cbb-cf56b370a5a2-networking-console-plugin-cert podName:8033b471-ca39-425f-9cbb-cf56b370a5a2 nodeName:}" failed. No retries permitted until 2026-04-21 14:56:11.567305264 +0000 UTC m=+35.551197298 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/8033b471-ca39-425f-9cbb-cf56b370a5a2-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-c5gp2" (UID: "8033b471-ca39-425f-9cbb-cf56b370a5a2") : secret "networking-console-plugin-cert" not found
Apr 21 14:56:10.668385 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:56:10.668343 2610 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/40ebea87-6126-42fa-bcf7-027f7fbce419-cert\") pod \"ingress-canary-6vmnw\" (UID: \"40ebea87-6126-42fa-bcf7-027f7fbce419\") " pod="openshift-ingress-canary/ingress-canary-6vmnw"
Apr 21 14:56:10.668558 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:56:10.668455 2610 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/5ac89af4-5925-4a52-a694-31a92b841ed6-registry-tls\") pod \"image-registry-67b4567bdb-m7dt5\" (UID: \"5ac89af4-5925-4a52-a694-31a92b841ed6\") " pod="openshift-image-registry/image-registry-67b4567bdb-m7dt5"
Apr 21 14:56:10.668558 ip-10-0-129-133 kubenswrapper[2610]: E0421 14:56:10.668466 2610 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 21 14:56:10.668558 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:56:10.668489 2610 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/17ba6101-b1f6-412d-b361-2276f610226b-metrics-tls\") pod \"dns-default-mhp6p\" (UID: \"17ba6101-b1f6-412d-b361-2276f610226b\") " pod="openshift-dns/dns-default-mhp6p"
Apr 21 14:56:10.668558 ip-10-0-129-133 kubenswrapper[2610]: E0421 14:56:10.668542 2610 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/40ebea87-6126-42fa-bcf7-027f7fbce419-cert podName:40ebea87-6126-42fa-bcf7-027f7fbce419 nodeName:}" failed. No retries permitted until 2026-04-21 14:56:11.668525207 +0000 UTC m=+35.652417252 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/40ebea87-6126-42fa-bcf7-027f7fbce419-cert") pod "ingress-canary-6vmnw" (UID: "40ebea87-6126-42fa-bcf7-027f7fbce419") : secret "canary-serving-cert" not found
Apr 21 14:56:10.668780 ip-10-0-129-133 kubenswrapper[2610]: E0421 14:56:10.668620 2610 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 21 14:56:10.668780 ip-10-0-129-133 kubenswrapper[2610]: E0421 14:56:10.668624 2610 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 21 14:56:10.668780 ip-10-0-129-133 kubenswrapper[2610]: E0421 14:56:10.668644 2610 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-67b4567bdb-m7dt5: secret "image-registry-tls" not found
Apr 21 14:56:10.668780 ip-10-0-129-133 kubenswrapper[2610]: E0421 14:56:10.668675 2610 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/17ba6101-b1f6-412d-b361-2276f610226b-metrics-tls podName:17ba6101-b1f6-412d-b361-2276f610226b nodeName:}" failed. No retries permitted until 2026-04-21 14:56:11.66866009 +0000 UTC m=+35.652552118 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/17ba6101-b1f6-412d-b361-2276f610226b-metrics-tls") pod "dns-default-mhp6p" (UID: "17ba6101-b1f6-412d-b361-2276f610226b") : secret "dns-default-metrics-tls" not found
Apr 21 14:56:10.668780 ip-10-0-129-133 kubenswrapper[2610]: E0421 14:56:10.668699 2610 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/5ac89af4-5925-4a52-a694-31a92b841ed6-registry-tls podName:5ac89af4-5925-4a52-a694-31a92b841ed6 nodeName:}" failed. No retries permitted until 2026-04-21 14:56:11.668682041 +0000 UTC m=+35.652574074 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/5ac89af4-5925-4a52-a694-31a92b841ed6-registry-tls") pod "image-registry-67b4567bdb-m7dt5" (UID: "5ac89af4-5925-4a52-a694-31a92b841ed6") : secret "image-registry-tls" not found
Apr 21 14:56:11.575982 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:56:11.575945 2610 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/8033b471-ca39-425f-9cbb-cf56b370a5a2-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-c5gp2\" (UID: \"8033b471-ca39-425f-9cbb-cf56b370a5a2\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-c5gp2"
Apr 21 14:56:11.576341 ip-10-0-129-133 kubenswrapper[2610]: E0421 14:56:11.576082 2610 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found
Apr 21 14:56:11.576341 ip-10-0-129-133 kubenswrapper[2610]: E0421 14:56:11.576147 2610 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8033b471-ca39-425f-9cbb-cf56b370a5a2-networking-console-plugin-cert podName:8033b471-ca39-425f-9cbb-cf56b370a5a2 nodeName:}" failed. No retries permitted until 2026-04-21 14:56:13.576133314 +0000 UTC m=+37.560025339 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/8033b471-ca39-425f-9cbb-cf56b370a5a2-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-c5gp2" (UID: "8033b471-ca39-425f-9cbb-cf56b370a5a2") : secret "networking-console-plugin-cert" not found
Apr 21 14:56:11.620875 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:56:11.620834 2610 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-mtdkf"
Apr 21 14:56:11.620875 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:56:11.620862 2610 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-h5rsz"
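
Note how the retry delays for the same volumes grow across the entries above: durationBeforeRetry goes 500ms, then 1s, then 2s, and later in this log 4s, 8s, 16s, 32s and 1m4s. The kubelet's pending-operation tracker doubles the wait after each failed mount attempt. A minimal sketch of that doubling schedule (the ceiling used here is an assumption for illustration; this log never shows the cap being reached):

package main

import (
	"fmt"
	"time"
)

func main() {
	delay := 500 * time.Millisecond // first durationBeforeRetry seen in the log
	maxDelay := 2 * time.Minute     // assumed cap, for illustration only
	for attempt := 1; attempt <= 9; attempt++ {
		fmt.Printf("attempt %d: durationBeforeRetry %v\n", attempt, delay)
		delay *= 2 // 500ms, 1s, 2s, 4s, 8s, 16s, 32s, 1m4s, ...
		if delay > maxDelay {
			delay = maxDelay
		}
	}
}
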
Apr 21 14:56:11.620875 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:56:11.620884 2610 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-gphbk"
Apr 21 14:56:11.623030 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:56:11.623006 2610 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\""
Apr 21 14:56:11.623134 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:56:11.623042 2610 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\""
Apr 21 14:56:11.623462 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:56:11.623444 2610 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"original-pull-secret\""
Apr 21 14:56:11.623568 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:56:11.623492 2610 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-cb4hj\""
Apr 21 14:56:11.623568 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:56:11.623502 2610 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-sdz22\""
Apr 21 14:56:11.623568 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:56:11.623539 2610 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\""
Apr 21 14:56:11.676843 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:56:11.676807 2610 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/5ac89af4-5925-4a52-a694-31a92b841ed6-registry-tls\") pod \"image-registry-67b4567bdb-m7dt5\" (UID: \"5ac89af4-5925-4a52-a694-31a92b841ed6\") " pod="openshift-image-registry/image-registry-67b4567bdb-m7dt5"
Apr 21 14:56:11.677008 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:56:11.676848 2610 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/17ba6101-b1f6-412d-b361-2276f610226b-metrics-tls\") pod \"dns-default-mhp6p\" (UID: \"17ba6101-b1f6-412d-b361-2276f610226b\") " pod="openshift-dns/dns-default-mhp6p"
Apr 21 14:56:11.677008 ip-10-0-129-133 kubenswrapper[2610]: E0421 14:56:11.676953 2610 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 21 14:56:11.677008 ip-10-0-129-133 kubenswrapper[2610]: E0421 14:56:11.676956 2610 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 21 14:56:11.677153 ip-10-0-129-133 kubenswrapper[2610]: E0421 14:56:11.677012 2610 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/17ba6101-b1f6-412d-b361-2276f610226b-metrics-tls podName:17ba6101-b1f6-412d-b361-2276f610226b nodeName:}" failed. No retries permitted until 2026-04-21 14:56:13.676994791 +0000 UTC m=+37.660886820 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/17ba6101-b1f6-412d-b361-2276f610226b-metrics-tls") pod "dns-default-mhp6p" (UID: "17ba6101-b1f6-412d-b361-2276f610226b") : secret "dns-default-metrics-tls" not found
Apr 21 14:56:11.677153 ip-10-0-129-133 kubenswrapper[2610]: E0421 14:56:11.677020 2610 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-67b4567bdb-m7dt5: secret "image-registry-tls" not found
Apr 21 14:56:11.677153 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:56:11.677037 2610 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/40ebea87-6126-42fa-bcf7-027f7fbce419-cert\") pod \"ingress-canary-6vmnw\" (UID: \"40ebea87-6126-42fa-bcf7-027f7fbce419\") " pod="openshift-ingress-canary/ingress-canary-6vmnw"
Apr 21 14:56:11.677153 ip-10-0-129-133 kubenswrapper[2610]: E0421 14:56:11.677063 2610 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/5ac89af4-5925-4a52-a694-31a92b841ed6-registry-tls podName:5ac89af4-5925-4a52-a694-31a92b841ed6 nodeName:}" failed. No retries permitted until 2026-04-21 14:56:13.677052841 +0000 UTC m=+37.660944873 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/5ac89af4-5925-4a52-a694-31a92b841ed6-registry-tls") pod "image-registry-67b4567bdb-m7dt5" (UID: "5ac89af4-5925-4a52-a694-31a92b841ed6") : secret "image-registry-tls" not found
Apr 21 14:56:11.677153 ip-10-0-129-133 kubenswrapper[2610]: E0421 14:56:11.677134 2610 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 21 14:56:11.677322 ip-10-0-129-133 kubenswrapper[2610]: E0421 14:56:11.677180 2610 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/40ebea87-6126-42fa-bcf7-027f7fbce419-cert podName:40ebea87-6126-42fa-bcf7-027f7fbce419 nodeName:}" failed. No retries permitted until 2026-04-21 14:56:13.677168193 +0000 UTC m=+37.661060219 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/40ebea87-6126-42fa-bcf7-027f7fbce419-cert") pod "ingress-canary-6vmnw" (UID: "40ebea87-6126-42fa-bcf7-027f7fbce419") : secret "canary-serving-cert" not found
Apr 21 14:56:12.932967 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:56:12.932732 2610 generic.go:358] "Generic (PLEG): container finished" podID="cf009889-4a60-4449-8425-a8c15708e69e" containerID="5cad92ac8a0145a0ceef527583ef3a4bdcae6c7ff2028acc897093aab41c2daa" exitCode=0
Apr 21 14:56:12.933295 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:56:12.932815 2610 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-kjdc5" event={"ID":"cf009889-4a60-4449-8425-a8c15708e69e","Type":"ContainerDied","Data":"5cad92ac8a0145a0ceef527583ef3a4bdcae6c7ff2028acc897093aab41c2daa"}
Apr 21 14:56:13.594474 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:56:13.594435 2610 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/8033b471-ca39-425f-9cbb-cf56b370a5a2-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-c5gp2\" (UID: \"8033b471-ca39-425f-9cbb-cf56b370a5a2\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-c5gp2"
Apr 21 14:56:13.594624 ip-10-0-129-133 kubenswrapper[2610]: E0421 14:56:13.594604 2610 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found
Apr 21 14:56:13.594691 ip-10-0-129-133 kubenswrapper[2610]: E0421 14:56:13.594682 2610 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8033b471-ca39-425f-9cbb-cf56b370a5a2-networking-console-plugin-cert podName:8033b471-ca39-425f-9cbb-cf56b370a5a2 nodeName:}" failed. No retries permitted until 2026-04-21 14:56:17.594665982 +0000 UTC m=+41.578558008 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/8033b471-ca39-425f-9cbb-cf56b370a5a2-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-c5gp2" (UID: "8033b471-ca39-425f-9cbb-cf56b370a5a2") : secret "networking-console-plugin-cert" not found
Apr 21 14:56:13.695591 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:56:13.695539 2610 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/40ebea87-6126-42fa-bcf7-027f7fbce419-cert\") pod \"ingress-canary-6vmnw\" (UID: \"40ebea87-6126-42fa-bcf7-027f7fbce419\") " pod="openshift-ingress-canary/ingress-canary-6vmnw"
Apr 21 14:56:13.695710 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:56:13.695637 2610 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/5ac89af4-5925-4a52-a694-31a92b841ed6-registry-tls\") pod \"image-registry-67b4567bdb-m7dt5\" (UID: \"5ac89af4-5925-4a52-a694-31a92b841ed6\") " pod="openshift-image-registry/image-registry-67b4567bdb-m7dt5"
Apr 21 14:56:13.695710 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:56:13.695660 2610 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/17ba6101-b1f6-412d-b361-2276f610226b-metrics-tls\") pod \"dns-default-mhp6p\" (UID: \"17ba6101-b1f6-412d-b361-2276f610226b\") " pod="openshift-dns/dns-default-mhp6p"
Apr 21 14:56:13.695710 ip-10-0-129-133 kubenswrapper[2610]: E0421 14:56:13.695690 2610 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 21 14:56:13.695802 ip-10-0-129-133 kubenswrapper[2610]: E0421 14:56:13.695754 2610 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 21 14:56:13.695802 ip-10-0-129-133 kubenswrapper[2610]: E0421 14:56:13.695768 2610 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 21 14:56:13.695802 ip-10-0-129-133 kubenswrapper[2610]: E0421 14:56:13.695783 2610 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/40ebea87-6126-42fa-bcf7-027f7fbce419-cert podName:40ebea87-6126-42fa-bcf7-027f7fbce419 nodeName:}" failed. No retries permitted until 2026-04-21 14:56:17.69576321 +0000 UTC m=+41.679655237 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/40ebea87-6126-42fa-bcf7-027f7fbce419-cert") pod "ingress-canary-6vmnw" (UID: "40ebea87-6126-42fa-bcf7-027f7fbce419") : secret "canary-serving-cert" not found
Apr 21 14:56:13.695802 ip-10-0-129-133 kubenswrapper[2610]: E0421 14:56:13.695784 2610 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-67b4567bdb-m7dt5: secret "image-registry-tls" not found
Apr 21 14:56:13.695925 ip-10-0-129-133 kubenswrapper[2610]: E0421 14:56:13.695803 2610 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/17ba6101-b1f6-412d-b361-2276f610226b-metrics-tls podName:17ba6101-b1f6-412d-b361-2276f610226b nodeName:}" failed. No retries permitted until 2026-04-21 14:56:17.69579398 +0000 UTC m=+41.679686007 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/17ba6101-b1f6-412d-b361-2276f610226b-metrics-tls") pod "dns-default-mhp6p" (UID: "17ba6101-b1f6-412d-b361-2276f610226b") : secret "dns-default-metrics-tls" not found
Apr 21 14:56:13.695925 ip-10-0-129-133 kubenswrapper[2610]: E0421 14:56:13.695823 2610 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/5ac89af4-5925-4a52-a694-31a92b841ed6-registry-tls podName:5ac89af4-5925-4a52-a694-31a92b841ed6 nodeName:}" failed. No retries permitted until 2026-04-21 14:56:17.695812244 +0000 UTC m=+41.679704270 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/5ac89af4-5925-4a52-a694-31a92b841ed6-registry-tls") pod "image-registry-67b4567bdb-m7dt5" (UID: "5ac89af4-5925-4a52-a694-31a92b841ed6") : secret "image-registry-tls" not found
Apr 21 14:56:13.937501 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:56:13.937471 2610 generic.go:358] "Generic (PLEG): container finished" podID="cf009889-4a60-4449-8425-a8c15708e69e" containerID="ae2ec5aea0af46e1f434c5ec1873874745b78ef676d7c172d115e28d0745e685" exitCode=0
Apr 21 14:56:13.937865 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:56:13.937539 2610 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-kjdc5" event={"ID":"cf009889-4a60-4449-8425-a8c15708e69e","Type":"ContainerDied","Data":"ae2ec5aea0af46e1f434c5ec1873874745b78ef676d7c172d115e28d0745e685"}
Apr 21 14:56:14.942683 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:56:14.942649 2610 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-kjdc5" event={"ID":"cf009889-4a60-4449-8425-a8c15708e69e","Type":"ContainerStarted","Data":"58fb024b822a6399fd74484fde38471a08c796ee84b61d4caf20939605dcc6b8"}
Apr 21 14:56:14.967661 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:56:14.967617 2610 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-kjdc5" podStartSLOduration=5.039391573 podStartE2EDuration="38.967605944s" podCreationTimestamp="2026-04-21 14:55:36 +0000 UTC" firstStartedPulling="2026-04-21 14:55:37.885661023 +0000 UTC m=+1.869553048" lastFinishedPulling="2026-04-21 14:56:11.813875389 +0000 UTC m=+35.797767419" observedRunningTime="2026-04-21 14:56:14.96715007 +0000 UTC m=+38.951042122" watchObservedRunningTime="2026-04-21 14:56:14.967605944 +0000 UTC m=+38.951497992"
Apr 21 14:56:17.625044 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:56:17.625001 2610 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/8033b471-ca39-425f-9cbb-cf56b370a5a2-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-c5gp2\" (UID: \"8033b471-ca39-425f-9cbb-cf56b370a5a2\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-c5gp2"
Apr 21 14:56:17.625401 ip-10-0-129-133 kubenswrapper[2610]: E0421 14:56:17.625140 2610 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found
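
The "Observed pod startup duration" entry above for multus-additional-cni-plugins-kjdc5 is internally consistent: podStartE2EDuration is the time from pod creation to the observed running state, and podStartSLOduration appears to be that figure minus the image-pull window (using the monotonic m=+ offsets in the same entry):

podStartE2EDuration = watchObservedRunningTime - podCreationTimestamp
                    = 14:56:14.967605944 - 14:55:36 = 38.967605944s
pull window         = lastFinishedPulling - firstStartedPulling
                    = m=+35.797767419 - m=+1.869553048 = 33.928214371s
podStartSLOduration = 38.967605944s - 33.928214371s = 5.039391573s

Both derived values match the logged podStartE2EDuration="38.967605944s" and podStartSLOduration=5.039391573 exactly; of the roughly 39 seconds of startup, all but about 5 seconds was image pulling.
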
Apr 21 14:56:17.625401 ip-10-0-129-133 kubenswrapper[2610]: E0421 14:56:17.625201 2610 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8033b471-ca39-425f-9cbb-cf56b370a5a2-networking-console-plugin-cert podName:8033b471-ca39-425f-9cbb-cf56b370a5a2 nodeName:}" failed. No retries permitted until 2026-04-21 14:56:25.625183075 +0000 UTC m=+49.609075115 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/8033b471-ca39-425f-9cbb-cf56b370a5a2-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-c5gp2" (UID: "8033b471-ca39-425f-9cbb-cf56b370a5a2") : secret "networking-console-plugin-cert" not found
Apr 21 14:56:17.726291 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:56:17.726261 2610 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/5ac89af4-5925-4a52-a694-31a92b841ed6-registry-tls\") pod \"image-registry-67b4567bdb-m7dt5\" (UID: \"5ac89af4-5925-4a52-a694-31a92b841ed6\") " pod="openshift-image-registry/image-registry-67b4567bdb-m7dt5"
Apr 21 14:56:17.726291 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:56:17.726295 2610 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/17ba6101-b1f6-412d-b361-2276f610226b-metrics-tls\") pod \"dns-default-mhp6p\" (UID: \"17ba6101-b1f6-412d-b361-2276f610226b\") " pod="openshift-dns/dns-default-mhp6p"
Apr 21 14:56:17.726480 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:56:17.726343 2610 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/40ebea87-6126-42fa-bcf7-027f7fbce419-cert\") pod \"ingress-canary-6vmnw\" (UID: \"40ebea87-6126-42fa-bcf7-027f7fbce419\") " pod="openshift-ingress-canary/ingress-canary-6vmnw"
Apr 21 14:56:17.726480 ip-10-0-129-133 kubenswrapper[2610]: E0421 14:56:17.726402 2610 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 21 14:56:17.726480 ip-10-0-129-133 kubenswrapper[2610]: E0421 14:56:17.726425 2610 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-67b4567bdb-m7dt5: secret "image-registry-tls" not found
Apr 21 14:56:17.726480 ip-10-0-129-133 kubenswrapper[2610]: E0421 14:56:17.726434 2610 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 21 14:56:17.726480 ip-10-0-129-133 kubenswrapper[2610]: E0421 14:56:17.726458 2610 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 21 14:56:17.726480 ip-10-0-129-133 kubenswrapper[2610]: E0421 14:56:17.726482 2610 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/5ac89af4-5925-4a52-a694-31a92b841ed6-registry-tls podName:5ac89af4-5925-4a52-a694-31a92b841ed6 nodeName:}" failed. No retries permitted until 2026-04-21 14:56:25.726465221 +0000 UTC m=+49.710357251 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/5ac89af4-5925-4a52-a694-31a92b841ed6-registry-tls") pod "image-registry-67b4567bdb-m7dt5" (UID: "5ac89af4-5925-4a52-a694-31a92b841ed6") : secret "image-registry-tls" not found
Apr 21 14:56:17.726692 ip-10-0-129-133 kubenswrapper[2610]: E0421 14:56:17.726504 2610 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/17ba6101-b1f6-412d-b361-2276f610226b-metrics-tls podName:17ba6101-b1f6-412d-b361-2276f610226b nodeName:}" failed. No retries permitted until 2026-04-21 14:56:25.726491007 +0000 UTC m=+49.710383037 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/17ba6101-b1f6-412d-b361-2276f610226b-metrics-tls") pod "dns-default-mhp6p" (UID: "17ba6101-b1f6-412d-b361-2276f610226b") : secret "dns-default-metrics-tls" not found
Apr 21 14:56:17.726692 ip-10-0-129-133 kubenswrapper[2610]: E0421 14:56:17.726516 2610 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/40ebea87-6126-42fa-bcf7-027f7fbce419-cert podName:40ebea87-6126-42fa-bcf7-027f7fbce419 nodeName:}" failed. No retries permitted until 2026-04-21 14:56:25.726511084 +0000 UTC m=+49.710403110 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/40ebea87-6126-42fa-bcf7-027f7fbce419-cert") pod "ingress-canary-6vmnw" (UID: "40ebea87-6126-42fa-bcf7-027f7fbce419") : secret "canary-serving-cert" not found
Apr 21 14:56:25.686176 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:56:25.686128 2610 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/8033b471-ca39-425f-9cbb-cf56b370a5a2-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-c5gp2\" (UID: \"8033b471-ca39-425f-9cbb-cf56b370a5a2\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-c5gp2"
Apr 21 14:56:25.686623 ip-10-0-129-133 kubenswrapper[2610]: E0421 14:56:25.686289 2610 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found
Apr 21 14:56:25.686623 ip-10-0-129-133 kubenswrapper[2610]: E0421 14:56:25.686359 2610 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8033b471-ca39-425f-9cbb-cf56b370a5a2-networking-console-plugin-cert podName:8033b471-ca39-425f-9cbb-cf56b370a5a2 nodeName:}" failed. No retries permitted until 2026-04-21 14:56:41.686342674 +0000 UTC m=+65.670234701 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/8033b471-ca39-425f-9cbb-cf56b370a5a2-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-c5gp2" (UID: "8033b471-ca39-425f-9cbb-cf56b370a5a2") : secret "networking-console-plugin-cert" not found
Apr 21 14:56:25.787440 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:56:25.787398 2610 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/40ebea87-6126-42fa-bcf7-027f7fbce419-cert\") pod \"ingress-canary-6vmnw\" (UID: \"40ebea87-6126-42fa-bcf7-027f7fbce419\") " pod="openshift-ingress-canary/ingress-canary-6vmnw"
Apr 21 14:56:25.787638 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:56:25.787469 2610 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/5ac89af4-5925-4a52-a694-31a92b841ed6-registry-tls\") pod \"image-registry-67b4567bdb-m7dt5\" (UID: \"5ac89af4-5925-4a52-a694-31a92b841ed6\") " pod="openshift-image-registry/image-registry-67b4567bdb-m7dt5"
Apr 21 14:56:25.787638 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:56:25.787489 2610 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/17ba6101-b1f6-412d-b361-2276f610226b-metrics-tls\") pod \"dns-default-mhp6p\" (UID: \"17ba6101-b1f6-412d-b361-2276f610226b\") " pod="openshift-dns/dns-default-mhp6p"
Apr 21 14:56:25.787638 ip-10-0-129-133 kubenswrapper[2610]: E0421 14:56:25.787569 2610 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 21 14:56:25.787788 ip-10-0-129-133 kubenswrapper[2610]: E0421 14:56:25.787648 2610 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 21 14:56:25.787788 ip-10-0-129-133 kubenswrapper[2610]: E0421 14:56:25.787664 2610 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/40ebea87-6126-42fa-bcf7-027f7fbce419-cert podName:40ebea87-6126-42fa-bcf7-027f7fbce419 nodeName:}" failed. No retries permitted until 2026-04-21 14:56:41.787648359 +0000 UTC m=+65.771540385 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/40ebea87-6126-42fa-bcf7-027f7fbce419-cert") pod "ingress-canary-6vmnw" (UID: "40ebea87-6126-42fa-bcf7-027f7fbce419") : secret "canary-serving-cert" not found
Apr 21 14:56:25.787788 ip-10-0-129-133 kubenswrapper[2610]: E0421 14:56:25.787666 2610 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-67b4567bdb-m7dt5: secret "image-registry-tls" not found
Apr 21 14:56:25.787788 ip-10-0-129-133 kubenswrapper[2610]: E0421 14:56:25.787594 2610 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 21 14:56:25.787788 ip-10-0-129-133 kubenswrapper[2610]: E0421 14:56:25.787723 2610 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/17ba6101-b1f6-412d-b361-2276f610226b-metrics-tls podName:17ba6101-b1f6-412d-b361-2276f610226b nodeName:}" failed. No retries permitted until 2026-04-21 14:56:41.787708945 +0000 UTC m=+65.771600971 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/17ba6101-b1f6-412d-b361-2276f610226b-metrics-tls") pod "dns-default-mhp6p" (UID: "17ba6101-b1f6-412d-b361-2276f610226b") : secret "dns-default-metrics-tls" not found
Apr 21 14:56:25.787788 ip-10-0-129-133 kubenswrapper[2610]: E0421 14:56:25.787736 2610 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/5ac89af4-5925-4a52-a694-31a92b841ed6-registry-tls podName:5ac89af4-5925-4a52-a694-31a92b841ed6 nodeName:}" failed. No retries permitted until 2026-04-21 14:56:41.787730154 +0000 UTC m=+65.771622179 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/5ac89af4-5925-4a52-a694-31a92b841ed6-registry-tls") pod "image-registry-67b4567bdb-m7dt5" (UID: "5ac89af4-5925-4a52-a694-31a92b841ed6") : secret "image-registry-tls" not found
Apr 21 14:56:26.491900 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:56:26.491859 2610 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/1649b770-32f3-4c98-9e33-13d820fcd898-original-pull-secret\") pod \"global-pull-secret-syncer-h5rsz\" (UID: \"1649b770-32f3-4c98-9e33-13d820fcd898\") " pod="kube-system/global-pull-secret-syncer-h5rsz"
Apr 21 14:56:26.495078 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:56:26.495057 2610 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/1649b770-32f3-4c98-9e33-13d820fcd898-original-pull-secret\") pod \"global-pull-secret-syncer-h5rsz\" (UID: \"1649b770-32f3-4c98-9e33-13d820fcd898\") " pod="kube-system/global-pull-secret-syncer-h5rsz"
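
Note the different failure mode for kube-system/original-pull-secret earlier in this log: object "kube-system"/"original-pull-secret" not registered rather than secret ... not found. "Not registered" appears to mean the kubelet's secret manager had not yet started watching that object for the pod, not that the API server confirmed its absence; once the reflector logged "Caches populated" for it at 14:56:11.623444, the next mount retry (14:56:26, after the 16s backoff) succeeded just above. The same populate-then-read pattern, sketched with client-go informers (illustrative only, not the kubelet's exact code path):

package main

import (
	"context"
	"fmt"
	"time"

	"k8s.io/client-go/informers"
	"k8s.io/client-go/kubernetes"
	"k8s.io/client-go/tools/cache"
	"k8s.io/client-go/tools/clientcmd"
)

func main() {
	cfg, err := clientcmd.BuildConfigFromFlags("", clientcmd.RecommendedHomeFile)
	if err != nil {
		panic(err)
	}
	cs, err := kubernetes.NewForConfig(cfg)
	if err != nil {
		panic(err)
	}
	ctx, cancel := context.WithTimeout(context.Background(), 30*time.Second)
	defer cancel()
	// Reads against the informer's store are unreliable until its cache has
	// synced -- the analogue of the "not registered" window in the log.
	factory := informers.NewSharedInformerFactoryWithOptions(cs, 0, informers.WithNamespace("kube-system"))
	secrets := factory.Core().V1().Secrets()
	informer := secrets.Informer()
	factory.Start(ctx.Done())
	if !cache.WaitForCacheSync(ctx.Done(), informer.HasSynced) {
		panic("cache never synced")
	}
	// Only after the sync does a lister Get reflect reality.
	s, err := secrets.Lister().Secrets("kube-system").Get("original-pull-secret")
	fmt.Println(s != nil, err)
}
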
Apr 21 14:56:26.636298 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:56:26.636263 2610 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-h5rsz"
Apr 21 14:56:26.782538 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:56:26.782476 2610 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-h5rsz"]
Apr 21 14:56:26.969940 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:56:26.969904 2610 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-h5rsz" event={"ID":"1649b770-32f3-4c98-9e33-13d820fcd898","Type":"ContainerStarted","Data":"196ce4ebfdf6d3867ce61b7c711389d827a8cc13b783a75570af028892d4908d"}
Apr 21 14:56:31.981072 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:56:31.980987 2610 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-h5rsz" event={"ID":"1649b770-32f3-4c98-9e33-13d820fcd898","Type":"ContainerStarted","Data":"3532d799b19763ffb6762109270ebd04cae65ba4d57861d480c750e5ff49e3a7"}
Apr 21 14:56:31.994730 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:56:31.994672 2610 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/global-pull-secret-syncer-h5rsz" podStartSLOduration=33.174780539 podStartE2EDuration="37.994659423s" podCreationTimestamp="2026-04-21 14:55:54 +0000 UTC" firstStartedPulling="2026-04-21 14:56:26.792302017 +0000 UTC m=+50.776194059" lastFinishedPulling="2026-04-21 14:56:31.612180918 +0000 UTC m=+55.596072943" observedRunningTime="2026-04-21 14:56:31.994242911 +0000 UTC m=+55.978134958" watchObservedRunningTime="2026-04-21 14:56:31.994659423 +0000 UTC m=+55.978551471"
Apr 21 14:56:34.922293 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:56:34.922266 2610 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-fqshv"
Apr 21 14:56:41.314039 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:56:41.313991 2610 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/9b064625-50f7-4c6a-be44-9aed34a00b26-metrics-certs\") pod \"network-metrics-daemon-mtdkf\" (UID: \"9b064625-50f7-4c6a-be44-9aed34a00b26\") " pod="openshift-multus/network-metrics-daemon-mtdkf"
Apr 21 14:56:41.316209 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:56:41.316187 2610 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\""
Apr 21 14:56:41.324635 ip-10-0-129-133 kubenswrapper[2610]: E0421 14:56:41.324612 2610 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found
Apr 21 14:56:41.324735 ip-10-0-129-133 kubenswrapper[2610]: E0421 14:56:41.324695 2610 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9b064625-50f7-4c6a-be44-9aed34a00b26-metrics-certs podName:9b064625-50f7-4c6a-be44-9aed34a00b26 nodeName:}" failed. No retries permitted until 2026-04-21 14:57:45.32467307 +0000 UTC m=+129.308565110 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/9b064625-50f7-4c6a-be44-9aed34a00b26-metrics-certs") pod "network-metrics-daemon-mtdkf" (UID: "9b064625-50f7-4c6a-be44-9aed34a00b26") : secret "metrics-daemon-secret" not found
Apr 21 14:56:41.515162 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:56:41.515122 2610 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jmmhw\" (UniqueName: \"kubernetes.io/projected/3a01b39a-d3cb-4c13-ba47-ae57c73ab5c1-kube-api-access-jmmhw\") pod \"network-check-target-gphbk\" (UID: \"3a01b39a-d3cb-4c13-ba47-ae57c73ab5c1\") " pod="openshift-network-diagnostics/network-check-target-gphbk"
Apr 21 14:56:41.517278 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:56:41.517257 2610 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\""
Apr 21 14:56:41.527276 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:56:41.527256 2610 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\""
Apr 21 14:56:41.539023 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:56:41.539000 2610 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-jmmhw\" (UniqueName: \"kubernetes.io/projected/3a01b39a-d3cb-4c13-ba47-ae57c73ab5c1-kube-api-access-jmmhw\") pod \"network-check-target-gphbk\" (UID: \"3a01b39a-d3cb-4c13-ba47-ae57c73ab5c1\") " pod="openshift-network-diagnostics/network-check-target-gphbk"
Apr 21 14:56:41.643402 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:56:41.643327 2610 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-sdz22\""
Apr 21 14:56:41.651867 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:56:41.651849 2610 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-gphbk"
Apr 21 14:56:41.717277 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:56:41.717249 2610 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/8033b471-ca39-425f-9cbb-cf56b370a5a2-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-c5gp2\" (UID: \"8033b471-ca39-425f-9cbb-cf56b370a5a2\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-c5gp2"
Apr 21 14:56:41.717396 ip-10-0-129-133 kubenswrapper[2610]: E0421 14:56:41.717382 2610 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/8033b471-ca39-425f-9cbb-cf56b370a5a2-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-c5gp2" (UID: "8033b471-ca39-425f-9cbb-cf56b370a5a2") : secret "networking-console-plugin-cert" not found Apr 21 14:56:41.774557 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:56:41.774524 2610 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-gphbk"] Apr 21 14:56:41.778639 ip-10-0-129-133 kubenswrapper[2610]: W0421 14:56:41.778603 2610 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3a01b39a_d3cb_4c13_ba47_ae57c73ab5c1.slice/crio-56b08059fe7fcce3939f4a162bb2945409fb93e376668773f2e508ff46945779 WatchSource:0}: Error finding container 56b08059fe7fcce3939f4a162bb2945409fb93e376668773f2e508ff46945779: Status 404 returned error can't find the container with id 56b08059fe7fcce3939f4a162bb2945409fb93e376668773f2e508ff46945779 Apr 21 14:56:41.817677 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:56:41.817644 2610 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/5ac89af4-5925-4a52-a694-31a92b841ed6-registry-tls\") pod \"image-registry-67b4567bdb-m7dt5\" (UID: \"5ac89af4-5925-4a52-a694-31a92b841ed6\") " pod="openshift-image-registry/image-registry-67b4567bdb-m7dt5" Apr 21 14:56:41.817808 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:56:41.817679 2610 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/17ba6101-b1f6-412d-b361-2276f610226b-metrics-tls\") pod \"dns-default-mhp6p\" (UID: \"17ba6101-b1f6-412d-b361-2276f610226b\") " pod="openshift-dns/dns-default-mhp6p" Apr 21 14:56:41.817808 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:56:41.817713 2610 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/40ebea87-6126-42fa-bcf7-027f7fbce419-cert\") pod \"ingress-canary-6vmnw\" (UID: \"40ebea87-6126-42fa-bcf7-027f7fbce419\") " pod="openshift-ingress-canary/ingress-canary-6vmnw" Apr 21 14:56:41.817808 ip-10-0-129-133 kubenswrapper[2610]: E0421 14:56:41.817803 2610 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 21 14:56:41.817993 ip-10-0-129-133 kubenswrapper[2610]: E0421 14:56:41.817809 2610 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 21 14:56:41.817993 ip-10-0-129-133 kubenswrapper[2610]: E0421 14:56:41.817814 2610 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 21 14:56:41.817993 ip-10-0-129-133 kubenswrapper[2610]: E0421 14:56:41.817838 2610 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-67b4567bdb-m7dt5: secret "image-registry-tls" not found Apr 21 14:56:41.817993 ip-10-0-129-133 kubenswrapper[2610]: E0421 14:56:41.817858 2610 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/40ebea87-6126-42fa-bcf7-027f7fbce419-cert podName:40ebea87-6126-42fa-bcf7-027f7fbce419 nodeName:}" failed. No retries permitted until 2026-04-21 14:57:13.817845085 +0000 UTC m=+97.801737111 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/40ebea87-6126-42fa-bcf7-027f7fbce419-cert") pod "ingress-canary-6vmnw" (UID: "40ebea87-6126-42fa-bcf7-027f7fbce419") : secret "canary-serving-cert" not found Apr 21 14:56:41.817993 ip-10-0-129-133 kubenswrapper[2610]: E0421 14:56:41.817871 2610 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/17ba6101-b1f6-412d-b361-2276f610226b-metrics-tls podName:17ba6101-b1f6-412d-b361-2276f610226b nodeName:}" failed. No retries permitted until 2026-04-21 14:57:13.817865371 +0000 UTC m=+97.801757396 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/17ba6101-b1f6-412d-b361-2276f610226b-metrics-tls") pod "dns-default-mhp6p" (UID: "17ba6101-b1f6-412d-b361-2276f610226b") : secret "dns-default-metrics-tls" not found Apr 21 14:56:41.817993 ip-10-0-129-133 kubenswrapper[2610]: E0421 14:56:41.817896 2610 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/5ac89af4-5925-4a52-a694-31a92b841ed6-registry-tls podName:5ac89af4-5925-4a52-a694-31a92b841ed6 nodeName:}" failed. No retries permitted until 2026-04-21 14:57:13.817877871 +0000 UTC m=+97.801769902 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/5ac89af4-5925-4a52-a694-31a92b841ed6-registry-tls") pod "image-registry-67b4567bdb-m7dt5" (UID: "5ac89af4-5925-4a52-a694-31a92b841ed6") : secret "image-registry-tls" not found Apr 21 14:56:41.999668 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:56:41.999636 2610 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-gphbk" event={"ID":"3a01b39a-d3cb-4c13-ba47-ae57c73ab5c1","Type":"ContainerStarted","Data":"56b08059fe7fcce3939f4a162bb2945409fb93e376668773f2e508ff46945779"} Apr 21 14:56:45.006277 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:56:45.006238 2610 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-gphbk" event={"ID":"3a01b39a-d3cb-4c13-ba47-ae57c73ab5c1","Type":"ContainerStarted","Data":"f5da4af42a402a996747a2839f35932966df9c837b1a872c39858e5001ef4d3a"} Apr 21 14:56:45.006673 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:56:45.006364 2610 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-network-diagnostics/network-check-target-gphbk" Apr 21 14:56:45.020341 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:56:45.020289 2610 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-target-gphbk" podStartSLOduration=65.343218472 podStartE2EDuration="1m8.020273141s" podCreationTimestamp="2026-04-21 14:55:37 +0000 UTC" firstStartedPulling="2026-04-21 14:56:41.780996612 +0000 UTC m=+65.764888638" lastFinishedPulling="2026-04-21 14:56:44.45805128 +0000 UTC m=+68.441943307" observedRunningTime="2026-04-21 14:56:45.019999765 +0000 UTC m=+69.003891813" watchObservedRunningTime="2026-04-21 14:56:45.020273141 +0000 UTC m=+69.004165189" Apr 21 14:57:13.746133 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:57:13.746100 2610 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/8033b471-ca39-425f-9cbb-cf56b370a5a2-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-c5gp2\" (UID: \"8033b471-ca39-425f-9cbb-cf56b370a5a2\") " 
pod="openshift-network-console/networking-console-plugin-cb95c66f6-c5gp2" Apr 21 14:57:13.746507 ip-10-0-129-133 kubenswrapper[2610]: E0421 14:57:13.746240 2610 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found Apr 21 14:57:13.746507 ip-10-0-129-133 kubenswrapper[2610]: E0421 14:57:13.746319 2610 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8033b471-ca39-425f-9cbb-cf56b370a5a2-networking-console-plugin-cert podName:8033b471-ca39-425f-9cbb-cf56b370a5a2 nodeName:}" failed. No retries permitted until 2026-04-21 14:58:17.746302618 +0000 UTC m=+161.730194648 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/8033b471-ca39-425f-9cbb-cf56b370a5a2-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-c5gp2" (UID: "8033b471-ca39-425f-9cbb-cf56b370a5a2") : secret "networking-console-plugin-cert" not found Apr 21 14:57:13.846869 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:57:13.846836 2610 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/5ac89af4-5925-4a52-a694-31a92b841ed6-registry-tls\") pod \"image-registry-67b4567bdb-m7dt5\" (UID: \"5ac89af4-5925-4a52-a694-31a92b841ed6\") " pod="openshift-image-registry/image-registry-67b4567bdb-m7dt5" Apr 21 14:57:13.847014 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:57:13.846877 2610 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/17ba6101-b1f6-412d-b361-2276f610226b-metrics-tls\") pod \"dns-default-mhp6p\" (UID: \"17ba6101-b1f6-412d-b361-2276f610226b\") " pod="openshift-dns/dns-default-mhp6p" Apr 21 14:57:13.847014 ip-10-0-129-133 kubenswrapper[2610]: E0421 14:57:13.846972 2610 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 21 14:57:13.847014 ip-10-0-129-133 kubenswrapper[2610]: E0421 14:57:13.846979 2610 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 21 14:57:13.847014 ip-10-0-129-133 kubenswrapper[2610]: E0421 14:57:13.846996 2610 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-67b4567bdb-m7dt5: secret "image-registry-tls" not found Apr 21 14:57:13.847186 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:57:13.847013 2610 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/40ebea87-6126-42fa-bcf7-027f7fbce419-cert\") pod \"ingress-canary-6vmnw\" (UID: \"40ebea87-6126-42fa-bcf7-027f7fbce419\") " pod="openshift-ingress-canary/ingress-canary-6vmnw" Apr 21 14:57:13.847186 ip-10-0-129-133 kubenswrapper[2610]: E0421 14:57:13.847027 2610 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/17ba6101-b1f6-412d-b361-2276f610226b-metrics-tls podName:17ba6101-b1f6-412d-b361-2276f610226b nodeName:}" failed. No retries permitted until 2026-04-21 14:58:17.847013854 +0000 UTC m=+161.830905884 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/17ba6101-b1f6-412d-b361-2276f610226b-metrics-tls") pod "dns-default-mhp6p" (UID: "17ba6101-b1f6-412d-b361-2276f610226b") : secret "dns-default-metrics-tls" not found Apr 21 14:57:13.847186 ip-10-0-129-133 kubenswrapper[2610]: E0421 14:57:13.847059 2610 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/5ac89af4-5925-4a52-a694-31a92b841ed6-registry-tls podName:5ac89af4-5925-4a52-a694-31a92b841ed6 nodeName:}" failed. No retries permitted until 2026-04-21 14:58:17.847042329 +0000 UTC m=+161.830934357 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/5ac89af4-5925-4a52-a694-31a92b841ed6-registry-tls") pod "image-registry-67b4567bdb-m7dt5" (UID: "5ac89af4-5925-4a52-a694-31a92b841ed6") : secret "image-registry-tls" not found Apr 21 14:57:13.847186 ip-10-0-129-133 kubenswrapper[2610]: E0421 14:57:13.847076 2610 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 21 14:57:13.847186 ip-10-0-129-133 kubenswrapper[2610]: E0421 14:57:13.847106 2610 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/40ebea87-6126-42fa-bcf7-027f7fbce419-cert podName:40ebea87-6126-42fa-bcf7-027f7fbce419 nodeName:}" failed. No retries permitted until 2026-04-21 14:58:17.847096563 +0000 UTC m=+161.830988590 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/40ebea87-6126-42fa-bcf7-027f7fbce419-cert") pod "ingress-canary-6vmnw" (UID: "40ebea87-6126-42fa-bcf7-027f7fbce419") : secret "canary-serving-cert" not found Apr 21 14:57:16.010487 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:57:16.010460 2610 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-gphbk" Apr 21 14:57:44.065417 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:57:44.065380 2610 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress/router-default-76d7d6f776-nvlj4"] Apr 21 14:57:44.068172 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:57:44.068157 2610 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress/router-default-76d7d6f776-nvlj4" Apr 21 14:57:44.070119 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:57:44.070097 2610 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-metrics-certs-default\"" Apr 21 14:57:44.070281 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:57:44.070104 2610 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-dockercfg-46rkc\"" Apr 21 14:57:44.070281 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:57:44.070101 2610 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"default-ingress-cert\"" Apr 21 14:57:44.070468 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:57:44.070450 2610 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-stats-default\"" Apr 21 14:57:44.070515 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:57:44.070470 2610 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"service-ca-bundle\"" Apr 21 14:57:44.070515 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:57:44.070499 2610 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"openshift-service-ca.crt\"" Apr 21 14:57:44.070634 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:57:44.070607 2610 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"kube-root-ca.crt\"" Apr 21 14:57:44.079104 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:57:44.079076 2610 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/router-default-76d7d6f776-nvlj4"] Apr 21 14:57:44.158082 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:57:44.158040 2610 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-operator-585dfdc468-fnrh5"] Apr 21 14:57:44.161076 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:57:44.161056 2610 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-75587bd455-s2zz4"] Apr 21 14:57:44.161222 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:57:44.161206 2610 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-insights/insights-operator-585dfdc468-fnrh5" Apr 21 14:57:44.163224 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:57:44.163199 2610 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-root-ca.crt\"" Apr 21 14:57:44.163358 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:57:44.163281 2610 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"openshift-insights-serving-cert\"" Apr 21 14:57:44.163358 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:57:44.163296 2610 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"openshift-service-ca.crt\"" Apr 21 14:57:44.163358 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:57:44.163309 2610 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"operator-dockercfg-kv55b\"" Apr 21 14:57:44.163524 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:57:44.163362 2610 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"service-ca-bundle\"" Apr 21 14:57:44.163939 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:57:44.163917 2610 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-s2zz4" Apr 21 14:57:44.165869 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:57:44.165844 2610 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"cluster-monitoring-operator-tls\"" Apr 21 14:57:44.168882 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:57:44.168864 2610 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"openshift-service-ca.crt\"" Apr 21 14:57:44.169487 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:57:44.169468 2610 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-root-ca.crt\"" Apr 21 14:57:44.169640 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:57:44.169623 2610 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"cluster-monitoring-operator-dockercfg-hc6nf\"" Apr 21 14:57:44.169722 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:57:44.169637 2610 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"telemetry-config\"" Apr 21 14:57:44.173490 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:57:44.173099 2610 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-operator-585dfdc468-fnrh5"] Apr 21 14:57:44.173789 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:57:44.173767 2610 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/e22e5723-18d9-4194-867b-028f5e78e14d-stats-auth\") pod \"router-default-76d7d6f776-nvlj4\" (UID: \"e22e5723-18d9-4194-867b-028f5e78e14d\") " pod="openshift-ingress/router-default-76d7d6f776-nvlj4" Apr 21 14:57:44.173871 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:57:44.173851 2610 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l2hs5\" (UniqueName: \"kubernetes.io/projected/e22e5723-18d9-4194-867b-028f5e78e14d-kube-api-access-l2hs5\") pod \"router-default-76d7d6f776-nvlj4\" (UID: \"e22e5723-18d9-4194-867b-028f5e78e14d\") " 
pod="openshift-ingress/router-default-76d7d6f776-nvlj4" Apr 21 14:57:44.173935 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:57:44.173918 2610 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e22e5723-18d9-4194-867b-028f5e78e14d-metrics-certs\") pod \"router-default-76d7d6f776-nvlj4\" (UID: \"e22e5723-18d9-4194-867b-028f5e78e14d\") " pod="openshift-ingress/router-default-76d7d6f776-nvlj4" Apr 21 14:57:44.173991 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:57:44.173951 2610 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e22e5723-18d9-4194-867b-028f5e78e14d-service-ca-bundle\") pod \"router-default-76d7d6f776-nvlj4\" (UID: \"e22e5723-18d9-4194-867b-028f5e78e14d\") " pod="openshift-ingress/router-default-76d7d6f776-nvlj4" Apr 21 14:57:44.173991 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:57:44.173979 2610 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/e22e5723-18d9-4194-867b-028f5e78e14d-default-certificate\") pod \"router-default-76d7d6f776-nvlj4\" (UID: \"e22e5723-18d9-4194-867b-028f5e78e14d\") " pod="openshift-ingress/router-default-76d7d6f776-nvlj4" Apr 21 14:57:44.174638 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:57:44.174357 2610 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-75587bd455-s2zz4"] Apr 21 14:57:44.174911 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:57:44.174888 2610 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"trusted-ca-bundle\"" Apr 21 14:57:44.274760 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:57:44.274720 2610 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e3d66dda-7e69-48a9-a23b-ca9cdad31f2b-serving-cert\") pod \"insights-operator-585dfdc468-fnrh5\" (UID: \"e3d66dda-7e69-48a9-a23b-ca9cdad31f2b\") " pod="openshift-insights/insights-operator-585dfdc468-fnrh5" Apr 21 14:57:44.274760 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:57:44.274769 2610 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/b4238647-950c-4b0e-ac27-6e6a0040c6dc-telemetry-config\") pod \"cluster-monitoring-operator-75587bd455-s2zz4\" (UID: \"b4238647-950c-4b0e-ac27-6e6a0040c6dc\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-s2zz4" Apr 21 14:57:44.274995 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:57:44.274794 2610 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e22e5723-18d9-4194-867b-028f5e78e14d-metrics-certs\") pod \"router-default-76d7d6f776-nvlj4\" (UID: \"e22e5723-18d9-4194-867b-028f5e78e14d\") " pod="openshift-ingress/router-default-76d7d6f776-nvlj4" Apr 21 14:57:44.274995 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:57:44.274815 2610 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/e22e5723-18d9-4194-867b-028f5e78e14d-default-certificate\") pod \"router-default-76d7d6f776-nvlj4\" (UID: \"e22e5723-18d9-4194-867b-028f5e78e14d\") " 
pod="openshift-ingress/router-default-76d7d6f776-nvlj4" Apr 21 14:57:44.274995 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:57:44.274843 2610 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rslxf\" (UniqueName: \"kubernetes.io/projected/b4238647-950c-4b0e-ac27-6e6a0040c6dc-kube-api-access-rslxf\") pod \"cluster-monitoring-operator-75587bd455-s2zz4\" (UID: \"b4238647-950c-4b0e-ac27-6e6a0040c6dc\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-s2zz4" Apr 21 14:57:44.274995 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:57:44.274858 2610 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/e3d66dda-7e69-48a9-a23b-ca9cdad31f2b-tmp\") pod \"insights-operator-585dfdc468-fnrh5\" (UID: \"e3d66dda-7e69-48a9-a23b-ca9cdad31f2b\") " pod="openshift-insights/insights-operator-585dfdc468-fnrh5" Apr 21 14:57:44.274995 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:57:44.274873 2610 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"snapshots\" (UniqueName: \"kubernetes.io/empty-dir/e3d66dda-7e69-48a9-a23b-ca9cdad31f2b-snapshots\") pod \"insights-operator-585dfdc468-fnrh5\" (UID: \"e3d66dda-7e69-48a9-a23b-ca9cdad31f2b\") " pod="openshift-insights/insights-operator-585dfdc468-fnrh5" Apr 21 14:57:44.274995 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:57:44.274912 2610 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e3d66dda-7e69-48a9-a23b-ca9cdad31f2b-trusted-ca-bundle\") pod \"insights-operator-585dfdc468-fnrh5\" (UID: \"e3d66dda-7e69-48a9-a23b-ca9cdad31f2b\") " pod="openshift-insights/insights-operator-585dfdc468-fnrh5" Apr 21 14:57:44.274995 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:57:44.274930 2610 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/e22e5723-18d9-4194-867b-028f5e78e14d-stats-auth\") pod \"router-default-76d7d6f776-nvlj4\" (UID: \"e22e5723-18d9-4194-867b-028f5e78e14d\") " pod="openshift-ingress/router-default-76d7d6f776-nvlj4" Apr 21 14:57:44.274995 ip-10-0-129-133 kubenswrapper[2610]: E0421 14:57:44.274956 2610 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Apr 21 14:57:44.275352 ip-10-0-129-133 kubenswrapper[2610]: E0421 14:57:44.275023 2610 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e22e5723-18d9-4194-867b-028f5e78e14d-metrics-certs podName:e22e5723-18d9-4194-867b-028f5e78e14d nodeName:}" failed. No retries permitted until 2026-04-21 14:57:44.775003739 +0000 UTC m=+128.758895803 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/e22e5723-18d9-4194-867b-028f5e78e14d-metrics-certs") pod "router-default-76d7d6f776-nvlj4" (UID: "e22e5723-18d9-4194-867b-028f5e78e14d") : secret "router-metrics-certs-default" not found Apr 21 14:57:44.275352 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:57:44.275019 2610 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/b4238647-950c-4b0e-ac27-6e6a0040c6dc-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-s2zz4\" (UID: \"b4238647-950c-4b0e-ac27-6e6a0040c6dc\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-s2zz4" Apr 21 14:57:44.275352 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:57:44.275064 2610 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e22e5723-18d9-4194-867b-028f5e78e14d-service-ca-bundle\") pod \"router-default-76d7d6f776-nvlj4\" (UID: \"e22e5723-18d9-4194-867b-028f5e78e14d\") " pod="openshift-ingress/router-default-76d7d6f776-nvlj4" Apr 21 14:57:44.275352 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:57:44.275117 2610 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-l2hs5\" (UniqueName: \"kubernetes.io/projected/e22e5723-18d9-4194-867b-028f5e78e14d-kube-api-access-l2hs5\") pod \"router-default-76d7d6f776-nvlj4\" (UID: \"e22e5723-18d9-4194-867b-028f5e78e14d\") " pod="openshift-ingress/router-default-76d7d6f776-nvlj4" Apr 21 14:57:44.275352 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:57:44.275148 2610 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nc7zx\" (UniqueName: \"kubernetes.io/projected/e3d66dda-7e69-48a9-a23b-ca9cdad31f2b-kube-api-access-nc7zx\") pod \"insights-operator-585dfdc468-fnrh5\" (UID: \"e3d66dda-7e69-48a9-a23b-ca9cdad31f2b\") " pod="openshift-insights/insights-operator-585dfdc468-fnrh5" Apr 21 14:57:44.275352 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:57:44.275183 2610 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e3d66dda-7e69-48a9-a23b-ca9cdad31f2b-service-ca-bundle\") pod \"insights-operator-585dfdc468-fnrh5\" (UID: \"e3d66dda-7e69-48a9-a23b-ca9cdad31f2b\") " pod="openshift-insights/insights-operator-585dfdc468-fnrh5" Apr 21 14:57:44.275352 ip-10-0-129-133 kubenswrapper[2610]: E0421 14:57:44.275218 2610 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/e22e5723-18d9-4194-867b-028f5e78e14d-service-ca-bundle podName:e22e5723-18d9-4194-867b-028f5e78e14d nodeName:}" failed. No retries permitted until 2026-04-21 14:57:44.77520073 +0000 UTC m=+128.759092757 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/e22e5723-18d9-4194-867b-028f5e78e14d-service-ca-bundle") pod "router-default-76d7d6f776-nvlj4" (UID: "e22e5723-18d9-4194-867b-028f5e78e14d") : configmap references non-existent config key: service-ca.crt Apr 21 14:57:44.277382 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:57:44.277362 2610 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/e22e5723-18d9-4194-867b-028f5e78e14d-stats-auth\") pod \"router-default-76d7d6f776-nvlj4\" (UID: \"e22e5723-18d9-4194-867b-028f5e78e14d\") " pod="openshift-ingress/router-default-76d7d6f776-nvlj4" Apr 21 14:57:44.277475 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:57:44.277404 2610 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/e22e5723-18d9-4194-867b-028f5e78e14d-default-certificate\") pod \"router-default-76d7d6f776-nvlj4\" (UID: \"e22e5723-18d9-4194-867b-028f5e78e14d\") " pod="openshift-ingress/router-default-76d7d6f776-nvlj4" Apr 21 14:57:44.283748 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:57:44.283723 2610 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-l2hs5\" (UniqueName: \"kubernetes.io/projected/e22e5723-18d9-4194-867b-028f5e78e14d-kube-api-access-l2hs5\") pod \"router-default-76d7d6f776-nvlj4\" (UID: \"e22e5723-18d9-4194-867b-028f5e78e14d\") " pod="openshift-ingress/router-default-76d7d6f776-nvlj4" Apr 21 14:57:44.376598 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:57:44.376487 2610 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/b4238647-950c-4b0e-ac27-6e6a0040c6dc-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-s2zz4\" (UID: \"b4238647-950c-4b0e-ac27-6e6a0040c6dc\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-s2zz4" Apr 21 14:57:44.376598 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:57:44.376558 2610 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-nc7zx\" (UniqueName: \"kubernetes.io/projected/e3d66dda-7e69-48a9-a23b-ca9cdad31f2b-kube-api-access-nc7zx\") pod \"insights-operator-585dfdc468-fnrh5\" (UID: \"e3d66dda-7e69-48a9-a23b-ca9cdad31f2b\") " pod="openshift-insights/insights-operator-585dfdc468-fnrh5" Apr 21 14:57:44.376911 ip-10-0-129-133 kubenswrapper[2610]: E0421 14:57:44.376624 2610 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 21 14:57:44.376911 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:57:44.376657 2610 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e3d66dda-7e69-48a9-a23b-ca9cdad31f2b-service-ca-bundle\") pod \"insights-operator-585dfdc468-fnrh5\" (UID: \"e3d66dda-7e69-48a9-a23b-ca9cdad31f2b\") " pod="openshift-insights/insights-operator-585dfdc468-fnrh5" Apr 21 14:57:44.376911 ip-10-0-129-133 kubenswrapper[2610]: E0421 14:57:44.376706 2610 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b4238647-950c-4b0e-ac27-6e6a0040c6dc-cluster-monitoring-operator-tls podName:b4238647-950c-4b0e-ac27-6e6a0040c6dc nodeName:}" failed. 
No retries permitted until 2026-04-21 14:57:44.876686218 +0000 UTC m=+128.860578249 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/b4238647-950c-4b0e-ac27-6e6a0040c6dc-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-s2zz4" (UID: "b4238647-950c-4b0e-ac27-6e6a0040c6dc") : secret "cluster-monitoring-operator-tls" not found Apr 21 14:57:44.376911 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:57:44.376763 2610 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e3d66dda-7e69-48a9-a23b-ca9cdad31f2b-serving-cert\") pod \"insights-operator-585dfdc468-fnrh5\" (UID: \"e3d66dda-7e69-48a9-a23b-ca9cdad31f2b\") " pod="openshift-insights/insights-operator-585dfdc468-fnrh5" Apr 21 14:57:44.376911 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:57:44.376810 2610 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/b4238647-950c-4b0e-ac27-6e6a0040c6dc-telemetry-config\") pod \"cluster-monitoring-operator-75587bd455-s2zz4\" (UID: \"b4238647-950c-4b0e-ac27-6e6a0040c6dc\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-s2zz4" Apr 21 14:57:44.376911 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:57:44.376902 2610 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rslxf\" (UniqueName: \"kubernetes.io/projected/b4238647-950c-4b0e-ac27-6e6a0040c6dc-kube-api-access-rslxf\") pod \"cluster-monitoring-operator-75587bd455-s2zz4\" (UID: \"b4238647-950c-4b0e-ac27-6e6a0040c6dc\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-s2zz4" Apr 21 14:57:44.377217 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:57:44.376929 2610 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/e3d66dda-7e69-48a9-a23b-ca9cdad31f2b-tmp\") pod \"insights-operator-585dfdc468-fnrh5\" (UID: \"e3d66dda-7e69-48a9-a23b-ca9cdad31f2b\") " pod="openshift-insights/insights-operator-585dfdc468-fnrh5" Apr 21 14:57:44.377217 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:57:44.376960 2610 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"snapshots\" (UniqueName: \"kubernetes.io/empty-dir/e3d66dda-7e69-48a9-a23b-ca9cdad31f2b-snapshots\") pod \"insights-operator-585dfdc468-fnrh5\" (UID: \"e3d66dda-7e69-48a9-a23b-ca9cdad31f2b\") " pod="openshift-insights/insights-operator-585dfdc468-fnrh5" Apr 21 14:57:44.377217 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:57:44.377019 2610 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e3d66dda-7e69-48a9-a23b-ca9cdad31f2b-trusted-ca-bundle\") pod \"insights-operator-585dfdc468-fnrh5\" (UID: \"e3d66dda-7e69-48a9-a23b-ca9cdad31f2b\") " pod="openshift-insights/insights-operator-585dfdc468-fnrh5" Apr 21 14:57:44.377373 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:57:44.377232 2610 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/e3d66dda-7e69-48a9-a23b-ca9cdad31f2b-tmp\") pod \"insights-operator-585dfdc468-fnrh5\" (UID: \"e3d66dda-7e69-48a9-a23b-ca9cdad31f2b\") " pod="openshift-insights/insights-operator-585dfdc468-fnrh5" Apr 21 14:57:44.377373 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:57:44.377233 2610 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e3d66dda-7e69-48a9-a23b-ca9cdad31f2b-service-ca-bundle\") pod \"insights-operator-585dfdc468-fnrh5\" (UID: \"e3d66dda-7e69-48a9-a23b-ca9cdad31f2b\") " pod="openshift-insights/insights-operator-585dfdc468-fnrh5" Apr 21 14:57:44.377531 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:57:44.377513 2610 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/b4238647-950c-4b0e-ac27-6e6a0040c6dc-telemetry-config\") pod \"cluster-monitoring-operator-75587bd455-s2zz4\" (UID: \"b4238647-950c-4b0e-ac27-6e6a0040c6dc\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-s2zz4" Apr 21 14:57:44.377598 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:57:44.377521 2610 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"snapshots\" (UniqueName: \"kubernetes.io/empty-dir/e3d66dda-7e69-48a9-a23b-ca9cdad31f2b-snapshots\") pod \"insights-operator-585dfdc468-fnrh5\" (UID: \"e3d66dda-7e69-48a9-a23b-ca9cdad31f2b\") " pod="openshift-insights/insights-operator-585dfdc468-fnrh5" Apr 21 14:57:44.378293 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:57:44.378276 2610 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e3d66dda-7e69-48a9-a23b-ca9cdad31f2b-trusted-ca-bundle\") pod \"insights-operator-585dfdc468-fnrh5\" (UID: \"e3d66dda-7e69-48a9-a23b-ca9cdad31f2b\") " pod="openshift-insights/insights-operator-585dfdc468-fnrh5" Apr 21 14:57:44.379155 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:57:44.379139 2610 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e3d66dda-7e69-48a9-a23b-ca9cdad31f2b-serving-cert\") pod \"insights-operator-585dfdc468-fnrh5\" (UID: \"e3d66dda-7e69-48a9-a23b-ca9cdad31f2b\") " pod="openshift-insights/insights-operator-585dfdc468-fnrh5" Apr 21 14:57:44.384872 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:57:44.384849 2610 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-rslxf\" (UniqueName: \"kubernetes.io/projected/b4238647-950c-4b0e-ac27-6e6a0040c6dc-kube-api-access-rslxf\") pod \"cluster-monitoring-operator-75587bd455-s2zz4\" (UID: \"b4238647-950c-4b0e-ac27-6e6a0040c6dc\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-s2zz4" Apr 21 14:57:44.384956 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:57:44.384920 2610 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-nc7zx\" (UniqueName: \"kubernetes.io/projected/e3d66dda-7e69-48a9-a23b-ca9cdad31f2b-kube-api-access-nc7zx\") pod \"insights-operator-585dfdc468-fnrh5\" (UID: \"e3d66dda-7e69-48a9-a23b-ca9cdad31f2b\") " pod="openshift-insights/insights-operator-585dfdc468-fnrh5" Apr 21 14:57:44.470968 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:57:44.470935 2610 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-insights/insights-operator-585dfdc468-fnrh5" Apr 21 14:57:44.583589 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:57:44.583535 2610 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-operator-585dfdc468-fnrh5"] Apr 21 14:57:44.586309 ip-10-0-129-133 kubenswrapper[2610]: W0421 14:57:44.586286 2610 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode3d66dda_7e69_48a9_a23b_ca9cdad31f2b.slice/crio-9eea0b46c217f318f38f671758cc6272b9dea7278d57a6b04726173bc687f34a WatchSource:0}: Error finding container 9eea0b46c217f318f38f671758cc6272b9dea7278d57a6b04726173bc687f34a: Status 404 returned error can't find the container with id 9eea0b46c217f318f38f671758cc6272b9dea7278d57a6b04726173bc687f34a Apr 21 14:57:44.781736 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:57:44.781699 2610 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e22e5723-18d9-4194-867b-028f5e78e14d-service-ca-bundle\") pod \"router-default-76d7d6f776-nvlj4\" (UID: \"e22e5723-18d9-4194-867b-028f5e78e14d\") " pod="openshift-ingress/router-default-76d7d6f776-nvlj4" Apr 21 14:57:44.781894 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:57:44.781785 2610 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e22e5723-18d9-4194-867b-028f5e78e14d-metrics-certs\") pod \"router-default-76d7d6f776-nvlj4\" (UID: \"e22e5723-18d9-4194-867b-028f5e78e14d\") " pod="openshift-ingress/router-default-76d7d6f776-nvlj4" Apr 21 14:57:44.781894 ip-10-0-129-133 kubenswrapper[2610]: E0421 14:57:44.781856 2610 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/e22e5723-18d9-4194-867b-028f5e78e14d-service-ca-bundle podName:e22e5723-18d9-4194-867b-028f5e78e14d nodeName:}" failed. No retries permitted until 2026-04-21 14:57:45.781839764 +0000 UTC m=+129.765731793 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/e22e5723-18d9-4194-867b-028f5e78e14d-service-ca-bundle") pod "router-default-76d7d6f776-nvlj4" (UID: "e22e5723-18d9-4194-867b-028f5e78e14d") : configmap references non-existent config key: service-ca.crt Apr 21 14:57:44.781970 ip-10-0-129-133 kubenswrapper[2610]: E0421 14:57:44.781898 2610 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Apr 21 14:57:44.782002 ip-10-0-129-133 kubenswrapper[2610]: E0421 14:57:44.781976 2610 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e22e5723-18d9-4194-867b-028f5e78e14d-metrics-certs podName:e22e5723-18d9-4194-867b-028f5e78e14d nodeName:}" failed. No retries permitted until 2026-04-21 14:57:45.781962478 +0000 UTC m=+129.765854503 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/e22e5723-18d9-4194-867b-028f5e78e14d-metrics-certs") pod "router-default-76d7d6f776-nvlj4" (UID: "e22e5723-18d9-4194-867b-028f5e78e14d") : secret "router-metrics-certs-default" not found Apr 21 14:57:44.883041 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:57:44.883013 2610 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/b4238647-950c-4b0e-ac27-6e6a0040c6dc-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-s2zz4\" (UID: \"b4238647-950c-4b0e-ac27-6e6a0040c6dc\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-s2zz4" Apr 21 14:57:44.883194 ip-10-0-129-133 kubenswrapper[2610]: E0421 14:57:44.883154 2610 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 21 14:57:44.883233 ip-10-0-129-133 kubenswrapper[2610]: E0421 14:57:44.883216 2610 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b4238647-950c-4b0e-ac27-6e6a0040c6dc-cluster-monitoring-operator-tls podName:b4238647-950c-4b0e-ac27-6e6a0040c6dc nodeName:}" failed. No retries permitted until 2026-04-21 14:57:45.883201566 +0000 UTC m=+129.867093598 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/b4238647-950c-4b0e-ac27-6e6a0040c6dc-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-s2zz4" (UID: "b4238647-950c-4b0e-ac27-6e6a0040c6dc") : secret "cluster-monitoring-operator-tls" not found Apr 21 14:57:45.119556 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:57:45.119466 2610 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-585dfdc468-fnrh5" event={"ID":"e3d66dda-7e69-48a9-a23b-ca9cdad31f2b","Type":"ContainerStarted","Data":"9eea0b46c217f318f38f671758cc6272b9dea7278d57a6b04726173bc687f34a"} Apr 21 14:57:45.387525 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:57:45.387435 2610 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/9b064625-50f7-4c6a-be44-9aed34a00b26-metrics-certs\") pod \"network-metrics-daemon-mtdkf\" (UID: \"9b064625-50f7-4c6a-be44-9aed34a00b26\") " pod="openshift-multus/network-metrics-daemon-mtdkf" Apr 21 14:57:45.387715 ip-10-0-129-133 kubenswrapper[2610]: E0421 14:57:45.387625 2610 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found Apr 21 14:57:45.387715 ip-10-0-129-133 kubenswrapper[2610]: E0421 14:57:45.387711 2610 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9b064625-50f7-4c6a-be44-9aed34a00b26-metrics-certs podName:9b064625-50f7-4c6a-be44-9aed34a00b26 nodeName:}" failed. No retries permitted until 2026-04-21 14:59:47.387688513 +0000 UTC m=+251.371580555 (durationBeforeRetry 2m2s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/9b064625-50f7-4c6a-be44-9aed34a00b26-metrics-certs") pod "network-metrics-daemon-mtdkf" (UID: "9b064625-50f7-4c6a-be44-9aed34a00b26") : secret "metrics-daemon-secret" not found Apr 21 14:57:45.791156 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:57:45.791121 2610 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e22e5723-18d9-4194-867b-028f5e78e14d-metrics-certs\") pod \"router-default-76d7d6f776-nvlj4\" (UID: \"e22e5723-18d9-4194-867b-028f5e78e14d\") " pod="openshift-ingress/router-default-76d7d6f776-nvlj4" Apr 21 14:57:45.791303 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:57:45.791191 2610 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e22e5723-18d9-4194-867b-028f5e78e14d-service-ca-bundle\") pod \"router-default-76d7d6f776-nvlj4\" (UID: \"e22e5723-18d9-4194-867b-028f5e78e14d\") " pod="openshift-ingress/router-default-76d7d6f776-nvlj4" Apr 21 14:57:45.791303 ip-10-0-129-133 kubenswrapper[2610]: E0421 14:57:45.791272 2610 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Apr 21 14:57:45.791374 ip-10-0-129-133 kubenswrapper[2610]: E0421 14:57:45.791281 2610 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/e22e5723-18d9-4194-867b-028f5e78e14d-service-ca-bundle podName:e22e5723-18d9-4194-867b-028f5e78e14d nodeName:}" failed. No retries permitted until 2026-04-21 14:57:47.791267945 +0000 UTC m=+131.775159971 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/e22e5723-18d9-4194-867b-028f5e78e14d-service-ca-bundle") pod "router-default-76d7d6f776-nvlj4" (UID: "e22e5723-18d9-4194-867b-028f5e78e14d") : configmap references non-existent config key: service-ca.crt Apr 21 14:57:45.791374 ip-10-0-129-133 kubenswrapper[2610]: E0421 14:57:45.791339 2610 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e22e5723-18d9-4194-867b-028f5e78e14d-metrics-certs podName:e22e5723-18d9-4194-867b-028f5e78e14d nodeName:}" failed. No retries permitted until 2026-04-21 14:57:47.791320769 +0000 UTC m=+131.775212805 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/e22e5723-18d9-4194-867b-028f5e78e14d-metrics-certs") pod "router-default-76d7d6f776-nvlj4" (UID: "e22e5723-18d9-4194-867b-028f5e78e14d") : secret "router-metrics-certs-default" not found Apr 21 14:57:45.892491 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:57:45.892455 2610 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/b4238647-950c-4b0e-ac27-6e6a0040c6dc-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-s2zz4\" (UID: \"b4238647-950c-4b0e-ac27-6e6a0040c6dc\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-s2zz4" Apr 21 14:57:45.892660 ip-10-0-129-133 kubenswrapper[2610]: E0421 14:57:45.892639 2610 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 21 14:57:45.892730 ip-10-0-129-133 kubenswrapper[2610]: E0421 14:57:45.892718 2610 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b4238647-950c-4b0e-ac27-6e6a0040c6dc-cluster-monitoring-operator-tls podName:b4238647-950c-4b0e-ac27-6e6a0040c6dc nodeName:}" failed. No retries permitted until 2026-04-21 14:57:47.892697572 +0000 UTC m=+131.876589601 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/b4238647-950c-4b0e-ac27-6e6a0040c6dc-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-s2zz4" (UID: "b4238647-950c-4b0e-ac27-6e6a0040c6dc") : secret "cluster-monitoring-operator-tls" not found Apr 21 14:57:47.125769 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:57:47.125720 2610 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-585dfdc468-fnrh5" event={"ID":"e3d66dda-7e69-48a9-a23b-ca9cdad31f2b","Type":"ContainerStarted","Data":"c2167a2d5db0f95ca0813b845cc364107b930f190179165ec93e18e426f647bb"} Apr 21 14:57:47.144595 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:57:47.144526 2610 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-operator-585dfdc468-fnrh5" podStartSLOduration=1.357278375 podStartE2EDuration="3.144506898s" podCreationTimestamp="2026-04-21 14:57:44 +0000 UTC" firstStartedPulling="2026-04-21 14:57:44.587990105 +0000 UTC m=+128.571882132" lastFinishedPulling="2026-04-21 14:57:46.375218614 +0000 UTC m=+130.359110655" observedRunningTime="2026-04-21 14:57:47.1427653 +0000 UTC m=+131.126657348" watchObservedRunningTime="2026-04-21 14:57:47.144506898 +0000 UTC m=+131.128398948" Apr 21 14:57:47.808984 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:57:47.808948 2610 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e22e5723-18d9-4194-867b-028f5e78e14d-service-ca-bundle\") pod \"router-default-76d7d6f776-nvlj4\" (UID: \"e22e5723-18d9-4194-867b-028f5e78e14d\") " pod="openshift-ingress/router-default-76d7d6f776-nvlj4" Apr 21 14:57:47.809149 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:57:47.809058 2610 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e22e5723-18d9-4194-867b-028f5e78e14d-metrics-certs\") pod \"router-default-76d7d6f776-nvlj4\" (UID: \"e22e5723-18d9-4194-867b-028f5e78e14d\") " 
pod="openshift-ingress/router-default-76d7d6f776-nvlj4" Apr 21 14:57:47.809149 ip-10-0-129-133 kubenswrapper[2610]: E0421 14:57:47.809120 2610 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/e22e5723-18d9-4194-867b-028f5e78e14d-service-ca-bundle podName:e22e5723-18d9-4194-867b-028f5e78e14d nodeName:}" failed. No retries permitted until 2026-04-21 14:57:51.809101511 +0000 UTC m=+135.792993552 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/e22e5723-18d9-4194-867b-028f5e78e14d-service-ca-bundle") pod "router-default-76d7d6f776-nvlj4" (UID: "e22e5723-18d9-4194-867b-028f5e78e14d") : configmap references non-existent config key: service-ca.crt Apr 21 14:57:47.809237 ip-10-0-129-133 kubenswrapper[2610]: E0421 14:57:47.809174 2610 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Apr 21 14:57:47.809237 ip-10-0-129-133 kubenswrapper[2610]: E0421 14:57:47.809226 2610 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e22e5723-18d9-4194-867b-028f5e78e14d-metrics-certs podName:e22e5723-18d9-4194-867b-028f5e78e14d nodeName:}" failed. No retries permitted until 2026-04-21 14:57:51.809214357 +0000 UTC m=+135.793106388 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/e22e5723-18d9-4194-867b-028f5e78e14d-metrics-certs") pod "router-default-76d7d6f776-nvlj4" (UID: "e22e5723-18d9-4194-867b-028f5e78e14d") : secret "router-metrics-certs-default" not found Apr 21 14:57:47.909588 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:57:47.909549 2610 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/b4238647-950c-4b0e-ac27-6e6a0040c6dc-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-s2zz4\" (UID: \"b4238647-950c-4b0e-ac27-6e6a0040c6dc\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-s2zz4" Apr 21 14:57:47.909698 ip-10-0-129-133 kubenswrapper[2610]: E0421 14:57:47.909679 2610 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 21 14:57:47.909754 ip-10-0-129-133 kubenswrapper[2610]: E0421 14:57:47.909744 2610 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b4238647-950c-4b0e-ac27-6e6a0040c6dc-cluster-monitoring-operator-tls podName:b4238647-950c-4b0e-ac27-6e6a0040c6dc nodeName:}" failed. No retries permitted until 2026-04-21 14:57:51.90973065 +0000 UTC m=+135.893622677 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/b4238647-950c-4b0e-ac27-6e6a0040c6dc-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-s2zz4" (UID: "b4238647-950c-4b0e-ac27-6e6a0040c6dc") : secret "cluster-monitoring-operator-tls" not found Apr 21 14:57:49.360465 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:57:49.360438 2610 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-n8qc9_b48f3832-4ecd-46ba-bde8-35a4180bf3ca/dns-node-resolver/0.log" Apr 21 14:57:50.360660 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:57:50.360629 2610 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-z6hwp_72237d81-3f9e-4b04-a299-0acb0dd6604c/node-ca/0.log" Apr 21 14:57:51.843376 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:57:51.843324 2610 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e22e5723-18d9-4194-867b-028f5e78e14d-metrics-certs\") pod \"router-default-76d7d6f776-nvlj4\" (UID: \"e22e5723-18d9-4194-867b-028f5e78e14d\") " pod="openshift-ingress/router-default-76d7d6f776-nvlj4" Apr 21 14:57:51.843791 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:57:51.843420 2610 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e22e5723-18d9-4194-867b-028f5e78e14d-service-ca-bundle\") pod \"router-default-76d7d6f776-nvlj4\" (UID: \"e22e5723-18d9-4194-867b-028f5e78e14d\") " pod="openshift-ingress/router-default-76d7d6f776-nvlj4" Apr 21 14:57:51.843791 ip-10-0-129-133 kubenswrapper[2610]: E0421 14:57:51.843492 2610 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Apr 21 14:57:51.843791 ip-10-0-129-133 kubenswrapper[2610]: E0421 14:57:51.843549 2610 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/e22e5723-18d9-4194-867b-028f5e78e14d-service-ca-bundle podName:e22e5723-18d9-4194-867b-028f5e78e14d nodeName:}" failed. No retries permitted until 2026-04-21 14:57:59.843536664 +0000 UTC m=+143.827428690 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/e22e5723-18d9-4194-867b-028f5e78e14d-service-ca-bundle") pod "router-default-76d7d6f776-nvlj4" (UID: "e22e5723-18d9-4194-867b-028f5e78e14d") : configmap references non-existent config key: service-ca.crt Apr 21 14:57:51.843791 ip-10-0-129-133 kubenswrapper[2610]: E0421 14:57:51.843566 2610 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e22e5723-18d9-4194-867b-028f5e78e14d-metrics-certs podName:e22e5723-18d9-4194-867b-028f5e78e14d nodeName:}" failed. No retries permitted until 2026-04-21 14:57:59.843559519 +0000 UTC m=+143.827451545 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/e22e5723-18d9-4194-867b-028f5e78e14d-metrics-certs") pod "router-default-76d7d6f776-nvlj4" (UID: "e22e5723-18d9-4194-867b-028f5e78e14d") : secret "router-metrics-certs-default" not found Apr 21 14:57:51.943995 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:57:51.943959 2610 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/b4238647-950c-4b0e-ac27-6e6a0040c6dc-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-s2zz4\" (UID: \"b4238647-950c-4b0e-ac27-6e6a0040c6dc\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-s2zz4" Apr 21 14:57:51.944163 ip-10-0-129-133 kubenswrapper[2610]: E0421 14:57:51.944131 2610 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 21 14:57:51.944226 ip-10-0-129-133 kubenswrapper[2610]: E0421 14:57:51.944214 2610 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b4238647-950c-4b0e-ac27-6e6a0040c6dc-cluster-monitoring-operator-tls podName:b4238647-950c-4b0e-ac27-6e6a0040c6dc nodeName:}" failed. No retries permitted until 2026-04-21 14:57:59.944193531 +0000 UTC m=+143.928085571 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/b4238647-950c-4b0e-ac27-6e6a0040c6dc-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-s2zz4" (UID: "b4238647-950c-4b0e-ac27-6e6a0040c6dc") : secret "cluster-monitoring-operator-tls" not found Apr 21 14:57:53.830305 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:57:53.830269 2610 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console-operator/console-operator-9d4b6777b-2k25j"] Apr 21 14:57:53.833245 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:57:53.833229 2610 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console-operator/console-operator-9d4b6777b-2k25j" Apr 21 14:57:53.834994 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:57:53.834955 2610 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console-operator\"/\"serving-cert\"" Apr 21 14:57:53.835156 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:57:53.835135 2610 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"openshift-service-ca.crt\"" Apr 21 14:57:53.835230 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:57:53.835157 2610 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console-operator\"/\"console-operator-dockercfg-pdjrf\"" Apr 21 14:57:53.835230 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:57:53.835197 2610 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"console-operator-config\"" Apr 21 14:57:53.835230 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:57:53.835139 2610 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"kube-root-ca.crt\"" Apr 21 14:57:53.839731 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:57:53.839710 2610 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"trusted-ca\"" Apr 21 14:57:53.843337 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:57:53.843321 2610 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-9d4b6777b-2k25j"] Apr 21 14:57:53.959956 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:57:53.959918 2610 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d982beb0-1451-48ab-b61a-060b6d23cfc7-serving-cert\") pod \"console-operator-9d4b6777b-2k25j\" (UID: \"d982beb0-1451-48ab-b61a-060b6d23cfc7\") " pod="openshift-console-operator/console-operator-9d4b6777b-2k25j" Apr 21 14:57:53.960118 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:57:53.959999 2610 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d982beb0-1451-48ab-b61a-060b6d23cfc7-config\") pod \"console-operator-9d4b6777b-2k25j\" (UID: \"d982beb0-1451-48ab-b61a-060b6d23cfc7\") " pod="openshift-console-operator/console-operator-9d4b6777b-2k25j" Apr 21 14:57:53.960118 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:57:53.960028 2610 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rrqbl\" (UniqueName: \"kubernetes.io/projected/d982beb0-1451-48ab-b61a-060b6d23cfc7-kube-api-access-rrqbl\") pod \"console-operator-9d4b6777b-2k25j\" (UID: \"d982beb0-1451-48ab-b61a-060b6d23cfc7\") " pod="openshift-console-operator/console-operator-9d4b6777b-2k25j" Apr 21 14:57:53.960118 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:57:53.960073 2610 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/d982beb0-1451-48ab-b61a-060b6d23cfc7-trusted-ca\") pod \"console-operator-9d4b6777b-2k25j\" (UID: \"d982beb0-1451-48ab-b61a-060b6d23cfc7\") " pod="openshift-console-operator/console-operator-9d4b6777b-2k25j" Apr 21 14:57:54.061163 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:57:54.061117 2610 reconciler_common.go:224] "operationExecutor.MountVolume started for 
volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/d982beb0-1451-48ab-b61a-060b6d23cfc7-trusted-ca\") pod \"console-operator-9d4b6777b-2k25j\" (UID: \"d982beb0-1451-48ab-b61a-060b6d23cfc7\") " pod="openshift-console-operator/console-operator-9d4b6777b-2k25j" Apr 21 14:57:54.061356 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:57:54.061201 2610 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d982beb0-1451-48ab-b61a-060b6d23cfc7-serving-cert\") pod \"console-operator-9d4b6777b-2k25j\" (UID: \"d982beb0-1451-48ab-b61a-060b6d23cfc7\") " pod="openshift-console-operator/console-operator-9d4b6777b-2k25j" Apr 21 14:57:54.061356 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:57:54.061247 2610 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d982beb0-1451-48ab-b61a-060b6d23cfc7-config\") pod \"console-operator-9d4b6777b-2k25j\" (UID: \"d982beb0-1451-48ab-b61a-060b6d23cfc7\") " pod="openshift-console-operator/console-operator-9d4b6777b-2k25j" Apr 21 14:57:54.061356 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:57:54.061274 2610 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rrqbl\" (UniqueName: \"kubernetes.io/projected/d982beb0-1451-48ab-b61a-060b6d23cfc7-kube-api-access-rrqbl\") pod \"console-operator-9d4b6777b-2k25j\" (UID: \"d982beb0-1451-48ab-b61a-060b6d23cfc7\") " pod="openshift-console-operator/console-operator-9d4b6777b-2k25j" Apr 21 14:57:54.061903 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:57:54.061882 2610 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/d982beb0-1451-48ab-b61a-060b6d23cfc7-trusted-ca\") pod \"console-operator-9d4b6777b-2k25j\" (UID: \"d982beb0-1451-48ab-b61a-060b6d23cfc7\") " pod="openshift-console-operator/console-operator-9d4b6777b-2k25j" Apr 21 14:57:54.061903 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:57:54.061891 2610 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d982beb0-1451-48ab-b61a-060b6d23cfc7-config\") pod \"console-operator-9d4b6777b-2k25j\" (UID: \"d982beb0-1451-48ab-b61a-060b6d23cfc7\") " pod="openshift-console-operator/console-operator-9d4b6777b-2k25j" Apr 21 14:57:54.063699 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:57:54.063681 2610 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d982beb0-1451-48ab-b61a-060b6d23cfc7-serving-cert\") pod \"console-operator-9d4b6777b-2k25j\" (UID: \"d982beb0-1451-48ab-b61a-060b6d23cfc7\") " pod="openshift-console-operator/console-operator-9d4b6777b-2k25j" Apr 21 14:57:54.072444 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:57:54.072417 2610 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-rrqbl\" (UniqueName: \"kubernetes.io/projected/d982beb0-1451-48ab-b61a-060b6d23cfc7-kube-api-access-rrqbl\") pod \"console-operator-9d4b6777b-2k25j\" (UID: \"d982beb0-1451-48ab-b61a-060b6d23cfc7\") " pod="openshift-console-operator/console-operator-9d4b6777b-2k25j" Apr 21 14:57:54.087188 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:57:54.087129 2610 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-v7qpf"] Apr 21 14:57:54.090188 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:57:54.090171 2610 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-v7qpf" Apr 21 14:57:54.092564 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:57:54.092542 2610 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-storage-operator\"/\"kube-root-ca.crt\"" Apr 21 14:57:54.092661 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:57:54.092542 2610 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-storage-operator\"/\"openshift-service-ca.crt\"" Apr 21 14:57:54.092661 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:57:54.092592 2610 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-storage-operator\"/\"volume-data-source-validator-dockercfg-vlkbn\"" Apr 21 14:57:54.102128 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:57:54.102108 2610 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-v7qpf"] Apr 21 14:57:54.142868 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:57:54.142839 2610 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-9d4b6777b-2k25j" Apr 21 14:57:54.162423 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:57:54.162378 2610 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2p6vp\" (UniqueName: \"kubernetes.io/projected/0de3ebe8-149d-4997-8a88-c28ce1dbe39d-kube-api-access-2p6vp\") pod \"volume-data-source-validator-7c6cbb6c87-v7qpf\" (UID: \"0de3ebe8-149d-4997-8a88-c28ce1dbe39d\") " pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-v7qpf" Apr 21 14:57:54.257424 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:57:54.257390 2610 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-9d4b6777b-2k25j"] Apr 21 14:57:54.261763 ip-10-0-129-133 kubenswrapper[2610]: W0421 14:57:54.261735 2610 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd982beb0_1451_48ab_b61a_060b6d23cfc7.slice/crio-c910769d90e9a12cfe8353be46296f07f22650e46cc0d0a5db5fc2361e4b295d WatchSource:0}: Error finding container c910769d90e9a12cfe8353be46296f07f22650e46cc0d0a5db5fc2361e4b295d: Status 404 returned error can't find the container with id c910769d90e9a12cfe8353be46296f07f22650e46cc0d0a5db5fc2361e4b295d Apr 21 14:57:54.263122 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:57:54.263102 2610 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2p6vp\" (UniqueName: \"kubernetes.io/projected/0de3ebe8-149d-4997-8a88-c28ce1dbe39d-kube-api-access-2p6vp\") pod \"volume-data-source-validator-7c6cbb6c87-v7qpf\" (UID: \"0de3ebe8-149d-4997-8a88-c28ce1dbe39d\") " pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-v7qpf" Apr 21 14:57:54.272057 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:57:54.272038 2610 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2p6vp\" (UniqueName: \"kubernetes.io/projected/0de3ebe8-149d-4997-8a88-c28ce1dbe39d-kube-api-access-2p6vp\") pod \"volume-data-source-validator-7c6cbb6c87-v7qpf\" (UID: \"0de3ebe8-149d-4997-8a88-c28ce1dbe39d\") " pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-v7qpf" Apr 21 14:57:54.399798 
ip-10-0-129-133 kubenswrapper[2610]: I0421 14:57:54.399721 2610 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-v7qpf" Apr 21 14:57:54.512305 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:57:54.512252 2610 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-v7qpf"] Apr 21 14:57:54.515706 ip-10-0-129-133 kubenswrapper[2610]: W0421 14:57:54.515676 2610 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0de3ebe8_149d_4997_8a88_c28ce1dbe39d.slice/crio-d187b758604bc224119b4cfab99e4fadc99d402a6f043637ff367b528ede2c56 WatchSource:0}: Error finding container d187b758604bc224119b4cfab99e4fadc99d402a6f043637ff367b528ede2c56: Status 404 returned error can't find the container with id d187b758604bc224119b4cfab99e4fadc99d402a6f043637ff367b528ede2c56 Apr 21 14:57:55.141395 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:57:55.141349 2610 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-v7qpf" event={"ID":"0de3ebe8-149d-4997-8a88-c28ce1dbe39d","Type":"ContainerStarted","Data":"d187b758604bc224119b4cfab99e4fadc99d402a6f043637ff367b528ede2c56"} Apr 21 14:57:55.142704 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:57:55.142660 2610 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-2k25j" event={"ID":"d982beb0-1451-48ab-b61a-060b6d23cfc7","Type":"ContainerStarted","Data":"c910769d90e9a12cfe8353be46296f07f22650e46cc0d0a5db5fc2361e4b295d"} Apr 21 14:57:57.148007 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:57:57.147969 2610 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-v7qpf" event={"ID":"0de3ebe8-149d-4997-8a88-c28ce1dbe39d","Type":"ContainerStarted","Data":"ca85c53ff9f71ca42c7388041ac3adb36d9df610da5e063be9fd6df2416fb500"} Apr 21 14:57:57.149450 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:57:57.149432 2610 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-2k25j_d982beb0-1451-48ab-b61a-060b6d23cfc7/console-operator/0.log" Apr 21 14:57:57.149526 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:57:57.149466 2610 generic.go:358] "Generic (PLEG): container finished" podID="d982beb0-1451-48ab-b61a-060b6d23cfc7" containerID="95a491d6a5c007458eadc8f93f32a4767598dfdca6e82d7b9d470bdd1dbba6a7" exitCode=255 Apr 21 14:57:57.149526 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:57:57.149500 2610 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-2k25j" event={"ID":"d982beb0-1451-48ab-b61a-060b6d23cfc7","Type":"ContainerDied","Data":"95a491d6a5c007458eadc8f93f32a4767598dfdca6e82d7b9d470bdd1dbba6a7"} Apr 21 14:57:57.149748 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:57:57.149732 2610 scope.go:117] "RemoveContainer" containerID="95a491d6a5c007458eadc8f93f32a4767598dfdca6e82d7b9d470bdd1dbba6a7" Apr 21 14:57:57.170489 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:57:57.170451 2610 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-v7qpf" podStartSLOduration=1.392346762 podStartE2EDuration="3.170437042s" podCreationTimestamp="2026-04-21 
14:57:54 +0000 UTC" firstStartedPulling="2026-04-21 14:57:54.517366265 +0000 UTC m=+138.501258292" lastFinishedPulling="2026-04-21 14:57:56.295456532 +0000 UTC m=+140.279348572" observedRunningTime="2026-04-21 14:57:57.170348788 +0000 UTC m=+141.154240847" watchObservedRunningTime="2026-04-21 14:57:57.170437042 +0000 UTC m=+141.154329089" Apr 21 14:57:58.154304 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:57:58.154278 2610 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-2k25j_d982beb0-1451-48ab-b61a-060b6d23cfc7/console-operator/1.log" Apr 21 14:57:58.154790 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:57:58.154609 2610 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-2k25j_d982beb0-1451-48ab-b61a-060b6d23cfc7/console-operator/0.log" Apr 21 14:57:58.154790 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:57:58.154644 2610 generic.go:358] "Generic (PLEG): container finished" podID="d982beb0-1451-48ab-b61a-060b6d23cfc7" containerID="7cb4830ffd2b70ac303a3e1faf89443ce0b31c21429e5c237e8f3968c1f1822f" exitCode=255 Apr 21 14:57:58.154790 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:57:58.154682 2610 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-2k25j" event={"ID":"d982beb0-1451-48ab-b61a-060b6d23cfc7","Type":"ContainerDied","Data":"7cb4830ffd2b70ac303a3e1faf89443ce0b31c21429e5c237e8f3968c1f1822f"} Apr 21 14:57:58.154790 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:57:58.154720 2610 scope.go:117] "RemoveContainer" containerID="95a491d6a5c007458eadc8f93f32a4767598dfdca6e82d7b9d470bdd1dbba6a7" Apr 21 14:57:58.155021 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:57:58.155006 2610 scope.go:117] "RemoveContainer" containerID="7cb4830ffd2b70ac303a3e1faf89443ce0b31c21429e5c237e8f3968c1f1822f" Apr 21 14:57:58.155214 ip-10-0-129-133 kubenswrapper[2610]: E0421 14:57:58.155194 2610 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=console-operator pod=console-operator-9d4b6777b-2k25j_openshift-console-operator(d982beb0-1451-48ab-b61a-060b6d23cfc7)\"" pod="openshift-console-operator/console-operator-9d4b6777b-2k25j" podUID="d982beb0-1451-48ab-b61a-060b6d23cfc7" Apr 21 14:57:58.899596 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:57:58.899531 2610 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator/migrator-74bb7799d9-wjc6n"] Apr 21 14:57:58.902857 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:57:58.902834 2610 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-wjc6n" Apr 21 14:57:58.905030 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:57:58.905005 2610 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator\"/\"openshift-service-ca.crt\"" Apr 21 14:57:58.905322 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:57:58.905305 2610 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-kube-storage-version-migrator\"/\"kube-storage-version-migrator-sa-dockercfg-pb455\"" Apr 21 14:57:58.905373 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:57:58.905341 2610 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator\"/\"kube-root-ca.crt\"" Apr 21 14:57:58.911776 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:57:58.911753 2610 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-74bb7799d9-wjc6n"] Apr 21 14:57:59.011334 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:57:59.011294 2610 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mz77r\" (UniqueName: \"kubernetes.io/projected/9c75de96-f2ee-425b-89e6-419195efd0a8-kube-api-access-mz77r\") pod \"migrator-74bb7799d9-wjc6n\" (UID: \"9c75de96-f2ee-425b-89e6-419195efd0a8\") " pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-wjc6n" Apr 21 14:57:59.112383 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:57:59.112337 2610 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mz77r\" (UniqueName: \"kubernetes.io/projected/9c75de96-f2ee-425b-89e6-419195efd0a8-kube-api-access-mz77r\") pod \"migrator-74bb7799d9-wjc6n\" (UID: \"9c75de96-f2ee-425b-89e6-419195efd0a8\") " pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-wjc6n" Apr 21 14:57:59.120635 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:57:59.120606 2610 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-mz77r\" (UniqueName: \"kubernetes.io/projected/9c75de96-f2ee-425b-89e6-419195efd0a8-kube-api-access-mz77r\") pod \"migrator-74bb7799d9-wjc6n\" (UID: \"9c75de96-f2ee-425b-89e6-419195efd0a8\") " pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-wjc6n" Apr 21 14:57:59.158268 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:57:59.158193 2610 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-2k25j_d982beb0-1451-48ab-b61a-060b6d23cfc7/console-operator/1.log" Apr 21 14:57:59.158629 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:57:59.158535 2610 scope.go:117] "RemoveContainer" containerID="7cb4830ffd2b70ac303a3e1faf89443ce0b31c21429e5c237e8f3968c1f1822f" Apr 21 14:57:59.158752 ip-10-0-129-133 kubenswrapper[2610]: E0421 14:57:59.158735 2610 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=console-operator pod=console-operator-9d4b6777b-2k25j_openshift-console-operator(d982beb0-1451-48ab-b61a-060b6d23cfc7)\"" pod="openshift-console-operator/console-operator-9d4b6777b-2k25j" podUID="d982beb0-1451-48ab-b61a-060b6d23cfc7" Apr 21 14:57:59.212353 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:57:59.212300 2610 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-wjc6n" Apr 21 14:57:59.326831 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:57:59.326801 2610 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-74bb7799d9-wjc6n"] Apr 21 14:57:59.329529 ip-10-0-129-133 kubenswrapper[2610]: W0421 14:57:59.329503 2610 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9c75de96_f2ee_425b_89e6_419195efd0a8.slice/crio-b204dd28b2c445cce3602028867cb906a831a3a0553dad33a67c16c3e1e8068b WatchSource:0}: Error finding container b204dd28b2c445cce3602028867cb906a831a3a0553dad33a67c16c3e1e8068b: Status 404 returned error can't find the container with id b204dd28b2c445cce3602028867cb906a831a3a0553dad33a67c16c3e1e8068b Apr 21 14:57:59.919288 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:57:59.919252 2610 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e22e5723-18d9-4194-867b-028f5e78e14d-metrics-certs\") pod \"router-default-76d7d6f776-nvlj4\" (UID: \"e22e5723-18d9-4194-867b-028f5e78e14d\") " pod="openshift-ingress/router-default-76d7d6f776-nvlj4" Apr 21 14:57:59.919466 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:57:59.919332 2610 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e22e5723-18d9-4194-867b-028f5e78e14d-service-ca-bundle\") pod \"router-default-76d7d6f776-nvlj4\" (UID: \"e22e5723-18d9-4194-867b-028f5e78e14d\") " pod="openshift-ingress/router-default-76d7d6f776-nvlj4" Apr 21 14:57:59.919466 ip-10-0-129-133 kubenswrapper[2610]: E0421 14:57:59.919434 2610 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Apr 21 14:57:59.919560 ip-10-0-129-133 kubenswrapper[2610]: E0421 14:57:59.919505 2610 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/e22e5723-18d9-4194-867b-028f5e78e14d-service-ca-bundle podName:e22e5723-18d9-4194-867b-028f5e78e14d nodeName:}" failed. No retries permitted until 2026-04-21 14:58:15.919484457 +0000 UTC m=+159.903376489 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/e22e5723-18d9-4194-867b-028f5e78e14d-service-ca-bundle") pod "router-default-76d7d6f776-nvlj4" (UID: "e22e5723-18d9-4194-867b-028f5e78e14d") : configmap references non-existent config key: service-ca.crt Apr 21 14:57:59.919560 ip-10-0-129-133 kubenswrapper[2610]: E0421 14:57:59.919525 2610 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e22e5723-18d9-4194-867b-028f5e78e14d-metrics-certs podName:e22e5723-18d9-4194-867b-028f5e78e14d nodeName:}" failed. No retries permitted until 2026-04-21 14:58:15.91951814 +0000 UTC m=+159.903410166 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/e22e5723-18d9-4194-867b-028f5e78e14d-metrics-certs") pod "router-default-76d7d6f776-nvlj4" (UID: "e22e5723-18d9-4194-867b-028f5e78e14d") : secret "router-metrics-certs-default" not found Apr 21 14:58:00.020392 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:58:00.020349 2610 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/b4238647-950c-4b0e-ac27-6e6a0040c6dc-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-s2zz4\" (UID: \"b4238647-950c-4b0e-ac27-6e6a0040c6dc\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-s2zz4" Apr 21 14:58:00.020559 ip-10-0-129-133 kubenswrapper[2610]: E0421 14:58:00.020501 2610 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 21 14:58:00.020633 ip-10-0-129-133 kubenswrapper[2610]: E0421 14:58:00.020588 2610 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b4238647-950c-4b0e-ac27-6e6a0040c6dc-cluster-monitoring-operator-tls podName:b4238647-950c-4b0e-ac27-6e6a0040c6dc nodeName:}" failed. No retries permitted until 2026-04-21 14:58:16.02055584 +0000 UTC m=+160.004447867 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/b4238647-950c-4b0e-ac27-6e6a0040c6dc-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-s2zz4" (UID: "b4238647-950c-4b0e-ac27-6e6a0040c6dc") : secret "cluster-monitoring-operator-tls" not found Apr 21 14:58:00.161943 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:58:00.161906 2610 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-wjc6n" event={"ID":"9c75de96-f2ee-425b-89e6-419195efd0a8","Type":"ContainerStarted","Data":"b204dd28b2c445cce3602028867cb906a831a3a0553dad33a67c16c3e1e8068b"} Apr 21 14:58:00.652426 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:58:00.652338 2610 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-runtime-extractor-bbvdg"] Apr 21 14:58:00.655538 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:58:00.655509 2610 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-insights/insights-runtime-extractor-bbvdg" Apr 21 14:58:00.657621 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:58:00.657603 2610 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-rbac-proxy\"" Apr 21 14:58:00.657704 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:58:00.657606 2610 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-sa-dockercfg-s4jk5\"" Apr 21 14:58:00.661264 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:58:00.661240 2610 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-tls\"" Apr 21 14:58:00.671982 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:58:00.671868 2610 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-bbvdg"] Apr 21 14:58:00.725695 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:58:00.725666 2610 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/bb217948-19df-46bd-9ef3-5c07750c4e03-data-volume\") pod \"insights-runtime-extractor-bbvdg\" (UID: \"bb217948-19df-46bd-9ef3-5c07750c4e03\") " pod="openshift-insights/insights-runtime-extractor-bbvdg" Apr 21 14:58:00.725823 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:58:00.725709 2610 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/bb217948-19df-46bd-9ef3-5c07750c4e03-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-bbvdg\" (UID: \"bb217948-19df-46bd-9ef3-5c07750c4e03\") " pod="openshift-insights/insights-runtime-extractor-bbvdg" Apr 21 14:58:00.725823 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:58:00.725808 2610 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/bb217948-19df-46bd-9ef3-5c07750c4e03-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-bbvdg\" (UID: \"bb217948-19df-46bd-9ef3-5c07750c4e03\") " pod="openshift-insights/insights-runtime-extractor-bbvdg" Apr 21 14:58:00.725905 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:58:00.725867 2610 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/bb217948-19df-46bd-9ef3-5c07750c4e03-crio-socket\") pod \"insights-runtime-extractor-bbvdg\" (UID: \"bb217948-19df-46bd-9ef3-5c07750c4e03\") " pod="openshift-insights/insights-runtime-extractor-bbvdg" Apr 21 14:58:00.725940 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:58:00.725912 2610 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8cjcd\" (UniqueName: \"kubernetes.io/projected/bb217948-19df-46bd-9ef3-5c07750c4e03-kube-api-access-8cjcd\") pod \"insights-runtime-extractor-bbvdg\" (UID: \"bb217948-19df-46bd-9ef3-5c07750c4e03\") " pod="openshift-insights/insights-runtime-extractor-bbvdg" Apr 21 14:58:00.826481 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:58:00.826441 2610 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/bb217948-19df-46bd-9ef3-5c07750c4e03-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-bbvdg\" (UID: 
\"bb217948-19df-46bd-9ef3-5c07750c4e03\") " pod="openshift-insights/insights-runtime-extractor-bbvdg" Apr 21 14:58:00.826702 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:58:00.826490 2610 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/bb217948-19df-46bd-9ef3-5c07750c4e03-crio-socket\") pod \"insights-runtime-extractor-bbvdg\" (UID: \"bb217948-19df-46bd-9ef3-5c07750c4e03\") " pod="openshift-insights/insights-runtime-extractor-bbvdg" Apr 21 14:58:00.826702 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:58:00.826509 2610 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8cjcd\" (UniqueName: \"kubernetes.io/projected/bb217948-19df-46bd-9ef3-5c07750c4e03-kube-api-access-8cjcd\") pod \"insights-runtime-extractor-bbvdg\" (UID: \"bb217948-19df-46bd-9ef3-5c07750c4e03\") " pod="openshift-insights/insights-runtime-extractor-bbvdg" Apr 21 14:58:00.826702 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:58:00.826545 2610 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/bb217948-19df-46bd-9ef3-5c07750c4e03-data-volume\") pod \"insights-runtime-extractor-bbvdg\" (UID: \"bb217948-19df-46bd-9ef3-5c07750c4e03\") " pod="openshift-insights/insights-runtime-extractor-bbvdg" Apr 21 14:58:00.826702 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:58:00.826561 2610 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/bb217948-19df-46bd-9ef3-5c07750c4e03-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-bbvdg\" (UID: \"bb217948-19df-46bd-9ef3-5c07750c4e03\") " pod="openshift-insights/insights-runtime-extractor-bbvdg" Apr 21 14:58:00.826702 ip-10-0-129-133 kubenswrapper[2610]: E0421 14:58:00.826617 2610 secret.go:189] Couldn't get secret openshift-insights/insights-runtime-extractor-tls: secret "insights-runtime-extractor-tls" not found Apr 21 14:58:00.826702 ip-10-0-129-133 kubenswrapper[2610]: E0421 14:58:00.826688 2610 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bb217948-19df-46bd-9ef3-5c07750c4e03-insights-runtime-extractor-tls podName:bb217948-19df-46bd-9ef3-5c07750c4e03 nodeName:}" failed. No retries permitted until 2026-04-21 14:58:01.326669878 +0000 UTC m=+145.310561904 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "insights-runtime-extractor-tls" (UniqueName: "kubernetes.io/secret/bb217948-19df-46bd-9ef3-5c07750c4e03-insights-runtime-extractor-tls") pod "insights-runtime-extractor-bbvdg" (UID: "bb217948-19df-46bd-9ef3-5c07750c4e03") : secret "insights-runtime-extractor-tls" not found Apr 21 14:58:00.826702 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:58:00.826616 2610 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/bb217948-19df-46bd-9ef3-5c07750c4e03-crio-socket\") pod \"insights-runtime-extractor-bbvdg\" (UID: \"bb217948-19df-46bd-9ef3-5c07750c4e03\") " pod="openshift-insights/insights-runtime-extractor-bbvdg" Apr 21 14:58:00.827061 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:58:00.826896 2610 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/bb217948-19df-46bd-9ef3-5c07750c4e03-data-volume\") pod \"insights-runtime-extractor-bbvdg\" (UID: \"bb217948-19df-46bd-9ef3-5c07750c4e03\") " pod="openshift-insights/insights-runtime-extractor-bbvdg" Apr 21 14:58:00.827203 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:58:00.827183 2610 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/bb217948-19df-46bd-9ef3-5c07750c4e03-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-bbvdg\" (UID: \"bb217948-19df-46bd-9ef3-5c07750c4e03\") " pod="openshift-insights/insights-runtime-extractor-bbvdg" Apr 21 14:58:00.834736 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:58:00.834711 2610 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-8cjcd\" (UniqueName: \"kubernetes.io/projected/bb217948-19df-46bd-9ef3-5c07750c4e03-kube-api-access-8cjcd\") pod \"insights-runtime-extractor-bbvdg\" (UID: \"bb217948-19df-46bd-9ef3-5c07750c4e03\") " pod="openshift-insights/insights-runtime-extractor-bbvdg" Apr 21 14:58:01.166448 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:58:01.166414 2610 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-wjc6n" event={"ID":"9c75de96-f2ee-425b-89e6-419195efd0a8","Type":"ContainerStarted","Data":"72b312d7b0715ec2e2868f1b53f268813bb0f13f4b6683a8ac4f91ca52210469"} Apr 21 14:58:01.166448 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:58:01.166449 2610 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-wjc6n" event={"ID":"9c75de96-f2ee-425b-89e6-419195efd0a8","Type":"ContainerStarted","Data":"e3717e6c72bf5e9c34c7416888b25e36ade4a655c2b35e65b5336768440f2147"} Apr 21 14:58:01.189281 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:58:01.189223 2610 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-wjc6n" podStartSLOduration=2.156186145 podStartE2EDuration="3.18920785s" podCreationTimestamp="2026-04-21 14:57:58 +0000 UTC" firstStartedPulling="2026-04-21 14:57:59.331401679 +0000 UTC m=+143.315293708" lastFinishedPulling="2026-04-21 14:58:00.364423388 +0000 UTC m=+144.348315413" observedRunningTime="2026-04-21 14:58:01.1877834 +0000 UTC m=+145.171675447" watchObservedRunningTime="2026-04-21 14:58:01.18920785 +0000 UTC m=+145.173099898" Apr 21 14:58:01.331427 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:58:01.331370 2610 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/bb217948-19df-46bd-9ef3-5c07750c4e03-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-bbvdg\" (UID: \"bb217948-19df-46bd-9ef3-5c07750c4e03\") " pod="openshift-insights/insights-runtime-extractor-bbvdg" Apr 21 14:58:01.331616 ip-10-0-129-133 kubenswrapper[2610]: E0421 14:58:01.331524 2610 secret.go:189] Couldn't get secret openshift-insights/insights-runtime-extractor-tls: secret "insights-runtime-extractor-tls" not found Apr 21 14:58:01.331616 ip-10-0-129-133 kubenswrapper[2610]: E0421 14:58:01.331607 2610 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bb217948-19df-46bd-9ef3-5c07750c4e03-insights-runtime-extractor-tls podName:bb217948-19df-46bd-9ef3-5c07750c4e03 nodeName:}" failed. No retries permitted until 2026-04-21 14:58:02.331590004 +0000 UTC m=+146.315482043 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "insights-runtime-extractor-tls" (UniqueName: "kubernetes.io/secret/bb217948-19df-46bd-9ef3-5c07750c4e03-insights-runtime-extractor-tls") pod "insights-runtime-extractor-bbvdg" (UID: "bb217948-19df-46bd-9ef3-5c07750c4e03") : secret "insights-runtime-extractor-tls" not found Apr 21 14:58:02.339925 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:58:02.339883 2610 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/bb217948-19df-46bd-9ef3-5c07750c4e03-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-bbvdg\" (UID: \"bb217948-19df-46bd-9ef3-5c07750c4e03\") " pod="openshift-insights/insights-runtime-extractor-bbvdg" Apr 21 14:58:02.340378 ip-10-0-129-133 kubenswrapper[2610]: E0421 14:58:02.340051 2610 secret.go:189] Couldn't get secret openshift-insights/insights-runtime-extractor-tls: secret "insights-runtime-extractor-tls" not found Apr 21 14:58:02.340378 ip-10-0-129-133 kubenswrapper[2610]: E0421 14:58:02.340121 2610 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bb217948-19df-46bd-9ef3-5c07750c4e03-insights-runtime-extractor-tls podName:bb217948-19df-46bd-9ef3-5c07750c4e03 nodeName:}" failed. No retries permitted until 2026-04-21 14:58:04.340105326 +0000 UTC m=+148.323997351 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "insights-runtime-extractor-tls" (UniqueName: "kubernetes.io/secret/bb217948-19df-46bd-9ef3-5c07750c4e03-insights-runtime-extractor-tls") pod "insights-runtime-extractor-bbvdg" (UID: "bb217948-19df-46bd-9ef3-5c07750c4e03") : secret "insights-runtime-extractor-tls" not found Apr 21 14:58:04.143957 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:58:04.143907 2610 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console-operator/console-operator-9d4b6777b-2k25j" Apr 21 14:58:04.144325 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:58:04.143969 2610 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-console-operator/console-operator-9d4b6777b-2k25j" Apr 21 14:58:04.144325 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:58:04.144296 2610 scope.go:117] "RemoveContainer" containerID="7cb4830ffd2b70ac303a3e1faf89443ce0b31c21429e5c237e8f3968c1f1822f" Apr 21 14:58:04.144467 ip-10-0-129-133 kubenswrapper[2610]: E0421 14:58:04.144450 2610 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=console-operator pod=console-operator-9d4b6777b-2k25j_openshift-console-operator(d982beb0-1451-48ab-b61a-060b6d23cfc7)\"" pod="openshift-console-operator/console-operator-9d4b6777b-2k25j" podUID="d982beb0-1451-48ab-b61a-060b6d23cfc7" Apr 21 14:58:04.356642 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:58:04.356608 2610 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/bb217948-19df-46bd-9ef3-5c07750c4e03-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-bbvdg\" (UID: \"bb217948-19df-46bd-9ef3-5c07750c4e03\") " pod="openshift-insights/insights-runtime-extractor-bbvdg" Apr 21 14:58:04.356781 ip-10-0-129-133 kubenswrapper[2610]: E0421 14:58:04.356739 2610 secret.go:189] Couldn't get secret openshift-insights/insights-runtime-extractor-tls: secret "insights-runtime-extractor-tls" not found Apr 21 14:58:04.356836 ip-10-0-129-133 kubenswrapper[2610]: E0421 14:58:04.356807 2610 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bb217948-19df-46bd-9ef3-5c07750c4e03-insights-runtime-extractor-tls podName:bb217948-19df-46bd-9ef3-5c07750c4e03 nodeName:}" failed. No retries permitted until 2026-04-21 14:58:08.356790616 +0000 UTC m=+152.340682649 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "insights-runtime-extractor-tls" (UniqueName: "kubernetes.io/secret/bb217948-19df-46bd-9ef3-5c07750c4e03-insights-runtime-extractor-tls") pod "insights-runtime-extractor-bbvdg" (UID: "bb217948-19df-46bd-9ef3-5c07750c4e03") : secret "insights-runtime-extractor-tls" not found Apr 21 14:58:08.391146 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:58:08.391100 2610 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/bb217948-19df-46bd-9ef3-5c07750c4e03-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-bbvdg\" (UID: \"bb217948-19df-46bd-9ef3-5c07750c4e03\") " pod="openshift-insights/insights-runtime-extractor-bbvdg" Apr 21 14:58:08.393587 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:58:08.393544 2610 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/bb217948-19df-46bd-9ef3-5c07750c4e03-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-bbvdg\" (UID: \"bb217948-19df-46bd-9ef3-5c07750c4e03\") " pod="openshift-insights/insights-runtime-extractor-bbvdg" Apr 21 14:58:08.464199 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:58:08.464164 2610 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-bbvdg" Apr 21 14:58:08.579879 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:58:08.579849 2610 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-bbvdg"] Apr 21 14:58:08.582966 ip-10-0-129-133 kubenswrapper[2610]: W0421 14:58:08.582938 2610 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbb217948_19df_46bd_9ef3_5c07750c4e03.slice/crio-ab8a86cb59aa9fb76c6278750731ad8ae4519f0a9bf1c4816e31d7821fd8d0f1 WatchSource:0}: Error finding container ab8a86cb59aa9fb76c6278750731ad8ae4519f0a9bf1c4816e31d7821fd8d0f1: Status 404 returned error can't find the container with id ab8a86cb59aa9fb76c6278750731ad8ae4519f0a9bf1c4816e31d7821fd8d0f1 Apr 21 14:58:09.187254 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:58:09.187223 2610 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-bbvdg" event={"ID":"bb217948-19df-46bd-9ef3-5c07750c4e03","Type":"ContainerStarted","Data":"85e78c83b823eae2462d2b99b148423c32081c35d35d111556fd16ac23a84e10"} Apr 21 14:58:09.187344 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:58:09.187258 2610 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-bbvdg" event={"ID":"bb217948-19df-46bd-9ef3-5c07750c4e03","Type":"ContainerStarted","Data":"9b48f4afdeb6a6a966586949d685833e7b8e93e5a212ed23ae5d34fcb5c2fb9f"} Apr 21 14:58:09.187344 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:58:09.187267 2610 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-bbvdg" event={"ID":"bb217948-19df-46bd-9ef3-5c07750c4e03","Type":"ContainerStarted","Data":"ab8a86cb59aa9fb76c6278750731ad8ae4519f0a9bf1c4816e31d7821fd8d0f1"} Apr 21 14:58:11.193724 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:58:11.193684 2610 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-bbvdg" 
event={"ID":"bb217948-19df-46bd-9ef3-5c07750c4e03","Type":"ContainerStarted","Data":"09d1264ffe862b0a762303f7ea6322b04b623c4c2e738a7878d4a3add4825610"} Apr 21 14:58:11.211120 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:58:11.211054 2610 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-runtime-extractor-bbvdg" podStartSLOduration=9.382266219 podStartE2EDuration="11.211040335s" podCreationTimestamp="2026-04-21 14:58:00 +0000 UTC" firstStartedPulling="2026-04-21 14:58:08.648769144 +0000 UTC m=+152.632661170" lastFinishedPulling="2026-04-21 14:58:10.477543251 +0000 UTC m=+154.461435286" observedRunningTime="2026-04-21 14:58:11.209975172 +0000 UTC m=+155.193867232" watchObservedRunningTime="2026-04-21 14:58:11.211040335 +0000 UTC m=+155.194932414" Apr 21 14:58:12.868355 ip-10-0-129-133 kubenswrapper[2610]: E0421 14:58:12.868299 2610 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[networking-console-plugin-cert], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-network-console/networking-console-plugin-cb95c66f6-c5gp2" podUID="8033b471-ca39-425f-9cbb-cf56b370a5a2" Apr 21 14:58:12.898784 ip-10-0-129-133 kubenswrapper[2610]: E0421 14:58:12.898741 2610 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[registry-tls], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-image-registry/image-registry-67b4567bdb-m7dt5" podUID="5ac89af4-5925-4a52-a694-31a92b841ed6" Apr 21 14:58:12.910868 ip-10-0-129-133 kubenswrapper[2610]: E0421 14:58:12.910833 2610 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[cert], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-ingress-canary/ingress-canary-6vmnw" podUID="40ebea87-6126-42fa-bcf7-027f7fbce419" Apr 21 14:58:12.976700 ip-10-0-129-133 kubenswrapper[2610]: E0421 14:58:12.976657 2610 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-tls], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-dns/dns-default-mhp6p" podUID="17ba6101-b1f6-412d-b361-2276f610226b" Apr 21 14:58:13.200702 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:58:13.200624 2610 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-6vmnw" Apr 21 14:58:13.200702 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:58:13.200624 2610 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-cb95c66f6-c5gp2" Apr 21 14:58:13.200886 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:58:13.200630 2610 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-67b4567bdb-m7dt5" Apr 21 14:58:14.631342 ip-10-0-129-133 kubenswrapper[2610]: E0421 14:58:14.631308 2610 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-certs], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-multus/network-metrics-daemon-mtdkf" podUID="9b064625-50f7-4c6a-be44-9aed34a00b26" Apr 21 14:58:15.621150 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:58:15.621119 2610 scope.go:117] "RemoveContainer" containerID="7cb4830ffd2b70ac303a3e1faf89443ce0b31c21429e5c237e8f3968c1f1822f" Apr 21 14:58:15.955718 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:58:15.955622 2610 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e22e5723-18d9-4194-867b-028f5e78e14d-service-ca-bundle\") pod \"router-default-76d7d6f776-nvlj4\" (UID: \"e22e5723-18d9-4194-867b-028f5e78e14d\") " pod="openshift-ingress/router-default-76d7d6f776-nvlj4" Apr 21 14:58:15.955718 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:58:15.955728 2610 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e22e5723-18d9-4194-867b-028f5e78e14d-metrics-certs\") pod \"router-default-76d7d6f776-nvlj4\" (UID: \"e22e5723-18d9-4194-867b-028f5e78e14d\") " pod="openshift-ingress/router-default-76d7d6f776-nvlj4" Apr 21 14:58:15.956284 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:58:15.956263 2610 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e22e5723-18d9-4194-867b-028f5e78e14d-service-ca-bundle\") pod \"router-default-76d7d6f776-nvlj4\" (UID: \"e22e5723-18d9-4194-867b-028f5e78e14d\") " pod="openshift-ingress/router-default-76d7d6f776-nvlj4" Apr 21 14:58:15.958121 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:58:15.958097 2610 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e22e5723-18d9-4194-867b-028f5e78e14d-metrics-certs\") pod \"router-default-76d7d6f776-nvlj4\" (UID: \"e22e5723-18d9-4194-867b-028f5e78e14d\") " pod="openshift-ingress/router-default-76d7d6f776-nvlj4" Apr 21 14:58:16.056433 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:58:16.056392 2610 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/b4238647-950c-4b0e-ac27-6e6a0040c6dc-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-s2zz4\" (UID: \"b4238647-950c-4b0e-ac27-6e6a0040c6dc\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-s2zz4" Apr 21 14:58:16.058969 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:58:16.058948 2610 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/b4238647-950c-4b0e-ac27-6e6a0040c6dc-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-s2zz4\" (UID: \"b4238647-950c-4b0e-ac27-6e6a0040c6dc\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-s2zz4" Apr 21 14:58:16.176798 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:58:16.176761 2610 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress/router-default-76d7d6f776-nvlj4" Apr 21 14:58:16.209789 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:58:16.209723 2610 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-2k25j_d982beb0-1451-48ab-b61a-060b6d23cfc7/console-operator/2.log" Apr 21 14:58:16.210147 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:58:16.210130 2610 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-2k25j_d982beb0-1451-48ab-b61a-060b6d23cfc7/console-operator/1.log" Apr 21 14:58:16.210191 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:58:16.210174 2610 generic.go:358] "Generic (PLEG): container finished" podID="d982beb0-1451-48ab-b61a-060b6d23cfc7" containerID="3b66ad60f2cf263007cef300f868f5d7edaa69c093f970d1c0ffa446e9f49d6a" exitCode=255 Apr 21 14:58:16.210244 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:58:16.210227 2610 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-2k25j" event={"ID":"d982beb0-1451-48ab-b61a-060b6d23cfc7","Type":"ContainerDied","Data":"3b66ad60f2cf263007cef300f868f5d7edaa69c093f970d1c0ffa446e9f49d6a"} Apr 21 14:58:16.210285 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:58:16.210272 2610 scope.go:117] "RemoveContainer" containerID="7cb4830ffd2b70ac303a3e1faf89443ce0b31c21429e5c237e8f3968c1f1822f" Apr 21 14:58:16.210542 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:58:16.210523 2610 scope.go:117] "RemoveContainer" containerID="3b66ad60f2cf263007cef300f868f5d7edaa69c093f970d1c0ffa446e9f49d6a" Apr 21 14:58:16.210809 ip-10-0-129-133 kubenswrapper[2610]: E0421 14:58:16.210774 2610 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 20s restarting failed container=console-operator pod=console-operator-9d4b6777b-2k25j_openshift-console-operator(d982beb0-1451-48ab-b61a-060b6d23cfc7)\"" pod="openshift-console-operator/console-operator-9d4b6777b-2k25j" podUID="d982beb0-1451-48ab-b61a-060b6d23cfc7" Apr 21 14:58:16.277188 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:58:16.277162 2610 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-s2zz4" Apr 21 14:58:16.300443 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:58:16.300413 2610 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/router-default-76d7d6f776-nvlj4"] Apr 21 14:58:16.303932 ip-10-0-129-133 kubenswrapper[2610]: W0421 14:58:16.303902 2610 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode22e5723_18d9_4194_867b_028f5e78e14d.slice/crio-70bf70f68612bc06b404512bbf821e477fea107b90f9d194819161a2728783de WatchSource:0}: Error finding container 70bf70f68612bc06b404512bbf821e477fea107b90f9d194819161a2728783de: Status 404 returned error can't find the container with id 70bf70f68612bc06b404512bbf821e477fea107b90f9d194819161a2728783de Apr 21 14:58:16.417644 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:58:16.417605 2610 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-75587bd455-s2zz4"] Apr 21 14:58:16.422953 ip-10-0-129-133 kubenswrapper[2610]: W0421 14:58:16.422924 2610 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb4238647_950c_4b0e_ac27_6e6a0040c6dc.slice/crio-6913b8f866d73d03edcd55c5e7a3491c3058bec620bcbb0d96f23c34348d343a WatchSource:0}: Error finding container 6913b8f866d73d03edcd55c5e7a3491c3058bec620bcbb0d96f23c34348d343a: Status 404 returned error can't find the container with id 6913b8f866d73d03edcd55c5e7a3491c3058bec620bcbb0d96f23c34348d343a Apr 21 14:58:17.214108 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:58:17.214067 2610 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-s2zz4" event={"ID":"b4238647-950c-4b0e-ac27-6e6a0040c6dc","Type":"ContainerStarted","Data":"6913b8f866d73d03edcd55c5e7a3491c3058bec620bcbb0d96f23c34348d343a"} Apr 21 14:58:17.215350 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:58:17.215319 2610 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-76d7d6f776-nvlj4" event={"ID":"e22e5723-18d9-4194-867b-028f5e78e14d","Type":"ContainerStarted","Data":"fed30c08d1d7183c7d157d417a627975670ad54d92a485345a4611a159d23bcd"} Apr 21 14:58:17.215483 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:58:17.215358 2610 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-76d7d6f776-nvlj4" event={"ID":"e22e5723-18d9-4194-867b-028f5e78e14d","Type":"ContainerStarted","Data":"70bf70f68612bc06b404512bbf821e477fea107b90f9d194819161a2728783de"} Apr 21 14:58:17.216731 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:58:17.216713 2610 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-2k25j_d982beb0-1451-48ab-b61a-060b6d23cfc7/console-operator/2.log" Apr 21 14:58:17.235142 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:58:17.234989 2610 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/router-default-76d7d6f776-nvlj4" podStartSLOduration=33.234972225 podStartE2EDuration="33.234972225s" podCreationTimestamp="2026-04-21 14:57:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-21 14:58:17.234295644 +0000 UTC m=+161.218187718" watchObservedRunningTime="2026-04-21 14:58:17.234972225 +0000 UTC m=+161.218864274" Apr 21 
14:58:17.769570 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:58:17.769529 2610 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/8033b471-ca39-425f-9cbb-cf56b370a5a2-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-c5gp2\" (UID: \"8033b471-ca39-425f-9cbb-cf56b370a5a2\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-c5gp2" Apr 21 14:58:17.772371 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:58:17.772348 2610 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/8033b471-ca39-425f-9cbb-cf56b370a5a2-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-c5gp2\" (UID: \"8033b471-ca39-425f-9cbb-cf56b370a5a2\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-c5gp2" Apr 21 14:58:17.870538 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:58:17.870500 2610 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/17ba6101-b1f6-412d-b361-2276f610226b-metrics-tls\") pod \"dns-default-mhp6p\" (UID: \"17ba6101-b1f6-412d-b361-2276f610226b\") " pod="openshift-dns/dns-default-mhp6p" Apr 21 14:58:17.870756 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:58:17.870560 2610 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/40ebea87-6126-42fa-bcf7-027f7fbce419-cert\") pod \"ingress-canary-6vmnw\" (UID: \"40ebea87-6126-42fa-bcf7-027f7fbce419\") " pod="openshift-ingress-canary/ingress-canary-6vmnw" Apr 21 14:58:17.870756 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:58:17.870616 2610 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/5ac89af4-5925-4a52-a694-31a92b841ed6-registry-tls\") pod \"image-registry-67b4567bdb-m7dt5\" (UID: \"5ac89af4-5925-4a52-a694-31a92b841ed6\") " pod="openshift-image-registry/image-registry-67b4567bdb-m7dt5" Apr 21 14:58:17.873597 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:58:17.873532 2610 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/5ac89af4-5925-4a52-a694-31a92b841ed6-registry-tls\") pod \"image-registry-67b4567bdb-m7dt5\" (UID: \"5ac89af4-5925-4a52-a694-31a92b841ed6\") " pod="openshift-image-registry/image-registry-67b4567bdb-m7dt5" Apr 21 14:58:17.873750 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:58:17.873721 2610 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/17ba6101-b1f6-412d-b361-2276f610226b-metrics-tls\") pod \"dns-default-mhp6p\" (UID: \"17ba6101-b1f6-412d-b361-2276f610226b\") " pod="openshift-dns/dns-default-mhp6p" Apr 21 14:58:17.873868 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:58:17.873802 2610 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/40ebea87-6126-42fa-bcf7-027f7fbce419-cert\") pod \"ingress-canary-6vmnw\" (UID: \"40ebea87-6126-42fa-bcf7-027f7fbce419\") " pod="openshift-ingress-canary/ingress-canary-6vmnw" Apr 21 14:58:18.007765 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:58:18.007730 2610 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-console\"/\"default-dockercfg-jbj88\"" Apr 21 14:58:18.007944 
ip-10-0-129-133 kubenswrapper[2610]: I0421 14:58:18.007790 2610 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-47f52\"" Apr 21 14:58:18.007944 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:58:18.007730 2610 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-7xfsn\"" Apr 21 14:58:18.011870 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:58:18.011827 2610 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-cb95c66f6-c5gp2" Apr 21 14:58:18.011870 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:58:18.011865 2610 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-67b4567bdb-m7dt5" Apr 21 14:58:18.012035 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:58:18.011883 2610 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-6vmnw" Apr 21 14:58:18.177260 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:58:18.177193 2610 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/router-default-76d7d6f776-nvlj4" Apr 21 14:58:18.182044 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:58:18.182014 2610 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/router-default-76d7d6f776-nvlj4" Apr 21 14:58:18.220706 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:58:18.220661 2610 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ingress/router-default-76d7d6f776-nvlj4" Apr 21 14:58:18.223227 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:58:18.223162 2610 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/router-default-76d7d6f776-nvlj4" Apr 21 14:58:18.284988 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:58:18.284959 2610 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-67b4567bdb-m7dt5"] Apr 21 14:58:18.293748 ip-10-0-129-133 kubenswrapper[2610]: W0421 14:58:18.293709 2610 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5ac89af4_5925_4a52_a694_31a92b841ed6.slice/crio-25ac7db99e1e7457666435b9cca9cfdb8d578cceeafcf33a6f8810022aa40715 WatchSource:0}: Error finding container 25ac7db99e1e7457666435b9cca9cfdb8d578cceeafcf33a6f8810022aa40715: Status 404 returned error can't find the container with id 25ac7db99e1e7457666435b9cca9cfdb8d578cceeafcf33a6f8810022aa40715 Apr 21 14:58:18.504105 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:58:18.504072 2610 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-6vmnw"] Apr 21 14:58:18.507020 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:58:18.506994 2610 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-console/networking-console-plugin-cb95c66f6-c5gp2"] Apr 21 14:58:18.508019 ip-10-0-129-133 kubenswrapper[2610]: W0421 14:58:18.507980 2610 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod40ebea87_6126_42fa_bcf7_027f7fbce419.slice/crio-8e6b2d218d2baa8735472aa17fd0854f6d08e6c7e4e2716f76f6742f8d44ca8c WatchSource:0}: Error finding container 8e6b2d218d2baa8735472aa17fd0854f6d08e6c7e4e2716f76f6742f8d44ca8c: Status 404 returned error 
can't find the container with id 8e6b2d218d2baa8735472aa17fd0854f6d08e6c7e4e2716f76f6742f8d44ca8c Apr 21 14:58:18.511863 ip-10-0-129-133 kubenswrapper[2610]: W0421 14:58:18.511837 2610 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8033b471_ca39_425f_9cbb_cf56b370a5a2.slice/crio-ed9997160ee7609ab6fe1e3b7577e0e461dfda6512d748dc14e7d4c2fefaf83a WatchSource:0}: Error finding container ed9997160ee7609ab6fe1e3b7577e0e461dfda6512d748dc14e7d4c2fefaf83a: Status 404 returned error can't find the container with id ed9997160ee7609ab6fe1e3b7577e0e461dfda6512d748dc14e7d4c2fefaf83a Apr 21 14:58:19.225174 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:58:19.225133 2610 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-s2zz4" event={"ID":"b4238647-950c-4b0e-ac27-6e6a0040c6dc","Type":"ContainerStarted","Data":"ebadac26db6a52842544d842fb58ecbdd69e07f5862238fc6197a3394e7516c5"} Apr 21 14:58:19.226907 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:58:19.226864 2610 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-6vmnw" event={"ID":"40ebea87-6126-42fa-bcf7-027f7fbce419","Type":"ContainerStarted","Data":"8e6b2d218d2baa8735472aa17fd0854f6d08e6c7e4e2716f76f6742f8d44ca8c"} Apr 21 14:58:19.228156 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:58:19.228115 2610 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-cb95c66f6-c5gp2" event={"ID":"8033b471-ca39-425f-9cbb-cf56b370a5a2","Type":"ContainerStarted","Data":"ed9997160ee7609ab6fe1e3b7577e0e461dfda6512d748dc14e7d4c2fefaf83a"} Apr 21 14:58:19.229929 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:58:19.229897 2610 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-67b4567bdb-m7dt5" event={"ID":"5ac89af4-5925-4a52-a694-31a92b841ed6","Type":"ContainerStarted","Data":"db83ee8f487fef3f2a7414f02660c0e125dce304e29e799dc0512666f9d0c4ed"} Apr 21 14:58:19.230022 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:58:19.229934 2610 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-67b4567bdb-m7dt5" event={"ID":"5ac89af4-5925-4a52-a694-31a92b841ed6","Type":"ContainerStarted","Data":"25ac7db99e1e7457666435b9cca9cfdb8d578cceeafcf33a6f8810022aa40715"} Apr 21 14:58:19.241800 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:58:19.241623 2610 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-s2zz4" podStartSLOduration=33.477052231 podStartE2EDuration="35.241608122s" podCreationTimestamp="2026-04-21 14:57:44 +0000 UTC" firstStartedPulling="2026-04-21 14:58:16.424787602 +0000 UTC m=+160.408679628" lastFinishedPulling="2026-04-21 14:58:18.189343491 +0000 UTC m=+162.173235519" observedRunningTime="2026-04-21 14:58:19.240259682 +0000 UTC m=+163.224151731" watchObservedRunningTime="2026-04-21 14:58:19.241608122 +0000 UTC m=+163.225500171" Apr 21 14:58:19.262472 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:58:19.262368 2610 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-67b4567bdb-m7dt5" podStartSLOduration=162.262349456 podStartE2EDuration="2m42.262349456s" podCreationTimestamp="2026-04-21 14:55:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" 
observedRunningTime="2026-04-21 14:58:19.261320084 +0000 UTC m=+163.245212133" watchObservedRunningTime="2026-04-21 14:58:19.262349456 +0000 UTC m=+163.246241482" Apr 21 14:58:20.233253 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:58:20.233217 2610 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-cb95c66f6-c5gp2" event={"ID":"8033b471-ca39-425f-9cbb-cf56b370a5a2","Type":"ContainerStarted","Data":"ec961e7ac7a3d6fdd81f14d90010b069c2c74f61c73e149220765245d7abfe58"} Apr 21 14:58:20.233732 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:58:20.233709 2610 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-image-registry/image-registry-67b4567bdb-m7dt5" Apr 21 14:58:20.246724 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:58:20.246686 2610 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-console/networking-console-plugin-cb95c66f6-c5gp2" podStartSLOduration=162.357030115 podStartE2EDuration="2m43.246672077s" podCreationTimestamp="2026-04-21 14:55:37 +0000 UTC" firstStartedPulling="2026-04-21 14:58:18.527659584 +0000 UTC m=+162.511551611" lastFinishedPulling="2026-04-21 14:58:19.417301543 +0000 UTC m=+163.401193573" observedRunningTime="2026-04-21 14:58:20.246067834 +0000 UTC m=+164.229959872" watchObservedRunningTime="2026-04-21 14:58:20.246672077 +0000 UTC m=+164.230564118" Apr 21 14:58:21.237123 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:58:21.237090 2610 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-6vmnw" event={"ID":"40ebea87-6126-42fa-bcf7-027f7fbce419","Type":"ContainerStarted","Data":"fa8c4091879d0fdf6c789092ba4aa0af40692818c8a4c7f84b3a50bd4340a373"} Apr 21 14:58:21.251357 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:58:21.251208 2610 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-6vmnw" podStartSLOduration=130.58829607 podStartE2EDuration="2m12.251193605s" podCreationTimestamp="2026-04-21 14:56:09 +0000 UTC" firstStartedPulling="2026-04-21 14:58:18.510178659 +0000 UTC m=+162.494070685" lastFinishedPulling="2026-04-21 14:58:20.173076191 +0000 UTC m=+164.156968220" observedRunningTime="2026-04-21 14:58:21.250720416 +0000 UTC m=+165.234612463" watchObservedRunningTime="2026-04-21 14:58:21.251193605 +0000 UTC m=+165.235085647" Apr 21 14:58:24.143846 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:58:24.143795 2610 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console-operator/console-operator-9d4b6777b-2k25j" Apr 21 14:58:24.143846 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:58:24.143852 2610 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-console-operator/console-operator-9d4b6777b-2k25j" Apr 21 14:58:24.144267 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:58:24.144197 2610 scope.go:117] "RemoveContainer" containerID="3b66ad60f2cf263007cef300f868f5d7edaa69c093f970d1c0ffa446e9f49d6a" Apr 21 14:58:24.144375 ip-10-0-129-133 kubenswrapper[2610]: E0421 14:58:24.144358 2610 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 20s restarting failed container=console-operator pod=console-operator-9d4b6777b-2k25j_openshift-console-operator(d982beb0-1451-48ab-b61a-060b6d23cfc7)\"" pod="openshift-console-operator/console-operator-9d4b6777b-2k25j" 
podUID="d982beb0-1451-48ab-b61a-060b6d23cfc7" Apr 21 14:58:25.621170 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:58:25.621092 2610 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-mtdkf" Apr 21 14:58:28.621205 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:58:28.621163 2610 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-mhp6p" Apr 21 14:58:28.623600 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:58:28.623563 2610 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-5sqkw\"" Apr 21 14:58:28.632434 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:58:28.632414 2610 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-mhp6p" Apr 21 14:58:28.751370 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:58:28.751337 2610 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-mhp6p"] Apr 21 14:58:28.755985 ip-10-0-129-133 kubenswrapper[2610]: W0421 14:58:28.755960 2610 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod17ba6101_b1f6_412d_b361_2276f610226b.slice/crio-e4be2f39bb3b16f78bf61d5f6338921da749f779601519259d60ff2384c06adb WatchSource:0}: Error finding container e4be2f39bb3b16f78bf61d5f6338921da749f779601519259d60ff2384c06adb: Status 404 returned error can't find the container with id e4be2f39bb3b16f78bf61d5f6338921da749f779601519259d60ff2384c06adb Apr 21 14:58:29.257417 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:58:29.257381 2610 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-mhp6p" event={"ID":"17ba6101-b1f6-412d-b361-2276f610226b","Type":"ContainerStarted","Data":"e4be2f39bb3b16f78bf61d5f6338921da749f779601519259d60ff2384c06adb"} Apr 21 14:58:30.156402 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:58:30.154271 2610 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/openshift-state-metrics-9d44df66c-lmv67"] Apr 21 14:58:30.160208 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:58:30.160183 2610 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-lmv67" Apr 21 14:58:30.162566 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:58:30.162547 2610 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-client-ca\"" Apr 21 14:58:30.162748 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:58:30.162609 2610 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"openshift-state-metrics-dockercfg-2x665\"" Apr 21 14:58:30.162916 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:58:30.162646 2610 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"openshift-state-metrics-tls\"" Apr 21 14:58:30.163072 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:58:30.162693 2610 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"openshift-state-metrics-kube-rbac-proxy-config\"" Apr 21 14:58:30.167245 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:58:30.167063 2610 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/openshift-state-metrics-9d44df66c-lmv67"] Apr 21 14:58:30.177129 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:58:30.177102 2610 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/node-exporter-zcpxh"] Apr 21 14:58:30.180552 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:58:30.180530 2610 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-zcpxh" Apr 21 14:58:30.186526 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:58:30.186504 2610 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-kube-rbac-proxy-config\"" Apr 21 14:58:30.186672 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:58:30.186553 2610 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-tls\"" Apr 21 14:58:30.186672 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:58:30.186567 2610 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"node-exporter-accelerators-collector-config\"" Apr 21 14:58:30.186753 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:58:30.186698 2610 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-dockercfg-v8bhv\"" Apr 21 14:58:30.260989 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:58:30.260914 2610 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-mhp6p" event={"ID":"17ba6101-b1f6-412d-b361-2276f610226b","Type":"ContainerStarted","Data":"19b94c54aa7ea87d63cafffbe13d65ec6768cccfa1eaf0daffc783fb986961b0"} Apr 21 14:58:30.260989 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:58:30.260946 2610 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-mhp6p" event={"ID":"17ba6101-b1f6-412d-b361-2276f610226b","Type":"ContainerStarted","Data":"b55038ecca638ca0defee44f0c89cfc28c4697b2d52bb72d5b84bee5dcfc330c"} Apr 21 14:58:30.261155 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:58:30.261069 2610 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-dns/dns-default-mhp6p" Apr 21 14:58:30.268630 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:58:30.268602 2610 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-kube-rbac-proxy-config\" 
(UniqueName: \"kubernetes.io/secret/6e950fdc-c8a9-4a4e-ac1e-c78d8747299f-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-zcpxh\" (UID: \"6e950fdc-c8a9-4a4e-ac1e-c78d8747299f\") " pod="openshift-monitoring/node-exporter-zcpxh" Apr 21 14:58:30.268771 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:58:30.268643 2610 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/6e950fdc-c8a9-4a4e-ac1e-c78d8747299f-node-exporter-wtmp\") pod \"node-exporter-zcpxh\" (UID: \"6e950fdc-c8a9-4a4e-ac1e-c78d8747299f\") " pod="openshift-monitoring/node-exporter-zcpxh" Apr 21 14:58:30.268771 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:58:30.268670 2610 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/6e950fdc-c8a9-4a4e-ac1e-c78d8747299f-root\") pod \"node-exporter-zcpxh\" (UID: \"6e950fdc-c8a9-4a4e-ac1e-c78d8747299f\") " pod="openshift-monitoring/node-exporter-zcpxh" Apr 21 14:58:30.268771 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:58:30.268700 2610 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/6e950fdc-c8a9-4a4e-ac1e-c78d8747299f-node-exporter-accelerators-collector-config\") pod \"node-exporter-zcpxh\" (UID: \"6e950fdc-c8a9-4a4e-ac1e-c78d8747299f\") " pod="openshift-monitoring/node-exporter-zcpxh" Apr 21 14:58:30.268771 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:58:30.268725 2610 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/6e950fdc-c8a9-4a4e-ac1e-c78d8747299f-node-exporter-textfile\") pod \"node-exporter-zcpxh\" (UID: \"6e950fdc-c8a9-4a4e-ac1e-c78d8747299f\") " pod="openshift-monitoring/node-exporter-zcpxh" Apr 21 14:58:30.268771 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:58:30.268756 2610 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8bf5j\" (UniqueName: \"kubernetes.io/projected/6e950fdc-c8a9-4a4e-ac1e-c78d8747299f-kube-api-access-8bf5j\") pod \"node-exporter-zcpxh\" (UID: \"6e950fdc-c8a9-4a4e-ac1e-c78d8747299f\") " pod="openshift-monitoring/node-exporter-zcpxh" Apr 21 14:58:30.269016 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:58:30.268793 2610 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/6e950fdc-c8a9-4a4e-ac1e-c78d8747299f-sys\") pod \"node-exporter-zcpxh\" (UID: \"6e950fdc-c8a9-4a4e-ac1e-c78d8747299f\") " pod="openshift-monitoring/node-exporter-zcpxh" Apr 21 14:58:30.269016 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:58:30.268819 2610 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/5b20462c-711e-48e3-bfc6-772375505353-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-9d44df66c-lmv67\" (UID: \"5b20462c-711e-48e3-bfc6-772375505353\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-lmv67" Apr 21 14:58:30.269016 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:58:30.268874 2610 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-state-metrics-tls\" 
(UniqueName: \"kubernetes.io/secret/5b20462c-711e-48e3-bfc6-772375505353-openshift-state-metrics-tls\") pod \"openshift-state-metrics-9d44df66c-lmv67\" (UID: \"5b20462c-711e-48e3-bfc6-772375505353\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-lmv67" Apr 21 14:58:30.269016 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:58:30.268893 2610 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/5b20462c-711e-48e3-bfc6-772375505353-metrics-client-ca\") pod \"openshift-state-metrics-9d44df66c-lmv67\" (UID: \"5b20462c-711e-48e3-bfc6-772375505353\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-lmv67" Apr 21 14:58:30.269016 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:58:30.268920 2610 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dxjmd\" (UniqueName: \"kubernetes.io/projected/5b20462c-711e-48e3-bfc6-772375505353-kube-api-access-dxjmd\") pod \"openshift-state-metrics-9d44df66c-lmv67\" (UID: \"5b20462c-711e-48e3-bfc6-772375505353\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-lmv67" Apr 21 14:58:30.269016 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:58:30.268945 2610 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/6e950fdc-c8a9-4a4e-ac1e-c78d8747299f-node-exporter-tls\") pod \"node-exporter-zcpxh\" (UID: \"6e950fdc-c8a9-4a4e-ac1e-c78d8747299f\") " pod="openshift-monitoring/node-exporter-zcpxh" Apr 21 14:58:30.269016 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:58:30.268959 2610 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/6e950fdc-c8a9-4a4e-ac1e-c78d8747299f-metrics-client-ca\") pod \"node-exporter-zcpxh\" (UID: \"6e950fdc-c8a9-4a4e-ac1e-c78d8747299f\") " pod="openshift-monitoring/node-exporter-zcpxh" Apr 21 14:58:30.280080 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:58:30.280029 2610 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-mhp6p" podStartSLOduration=140.025727885 podStartE2EDuration="2m21.280015662s" podCreationTimestamp="2026-04-21 14:56:09 +0000 UTC" firstStartedPulling="2026-04-21 14:58:28.758279153 +0000 UTC m=+172.742171194" lastFinishedPulling="2026-04-21 14:58:30.012566946 +0000 UTC m=+173.996458971" observedRunningTime="2026-04-21 14:58:30.278708058 +0000 UTC m=+174.262600103" watchObservedRunningTime="2026-04-21 14:58:30.280015662 +0000 UTC m=+174.263907727" Apr 21 14:58:30.370218 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:58:30.370175 2610 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/5b20462c-711e-48e3-bfc6-772375505353-openshift-state-metrics-tls\") pod \"openshift-state-metrics-9d44df66c-lmv67\" (UID: \"5b20462c-711e-48e3-bfc6-772375505353\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-lmv67" Apr 21 14:58:30.370410 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:58:30.370233 2610 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/5b20462c-711e-48e3-bfc6-772375505353-metrics-client-ca\") pod \"openshift-state-metrics-9d44df66c-lmv67\" (UID: \"5b20462c-711e-48e3-bfc6-772375505353\") " 
pod="openshift-monitoring/openshift-state-metrics-9d44df66c-lmv67" Apr 21 14:58:30.370410 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:58:30.370257 2610 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-dxjmd\" (UniqueName: \"kubernetes.io/projected/5b20462c-711e-48e3-bfc6-772375505353-kube-api-access-dxjmd\") pod \"openshift-state-metrics-9d44df66c-lmv67\" (UID: \"5b20462c-711e-48e3-bfc6-772375505353\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-lmv67" Apr 21 14:58:30.370410 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:58:30.370282 2610 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/6e950fdc-c8a9-4a4e-ac1e-c78d8747299f-node-exporter-tls\") pod \"node-exporter-zcpxh\" (UID: \"6e950fdc-c8a9-4a4e-ac1e-c78d8747299f\") " pod="openshift-monitoring/node-exporter-zcpxh" Apr 21 14:58:30.370410 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:58:30.370298 2610 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/6e950fdc-c8a9-4a4e-ac1e-c78d8747299f-metrics-client-ca\") pod \"node-exporter-zcpxh\" (UID: \"6e950fdc-c8a9-4a4e-ac1e-c78d8747299f\") " pod="openshift-monitoring/node-exporter-zcpxh" Apr 21 14:58:30.370410 ip-10-0-129-133 kubenswrapper[2610]: E0421 14:58:30.370309 2610 secret.go:189] Couldn't get secret openshift-monitoring/openshift-state-metrics-tls: secret "openshift-state-metrics-tls" not found Apr 21 14:58:30.370410 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:58:30.370329 2610 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/6e950fdc-c8a9-4a4e-ac1e-c78d8747299f-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-zcpxh\" (UID: \"6e950fdc-c8a9-4a4e-ac1e-c78d8747299f\") " pod="openshift-monitoring/node-exporter-zcpxh" Apr 21 14:58:30.370410 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:58:30.370347 2610 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/6e950fdc-c8a9-4a4e-ac1e-c78d8747299f-node-exporter-wtmp\") pod \"node-exporter-zcpxh\" (UID: \"6e950fdc-c8a9-4a4e-ac1e-c78d8747299f\") " pod="openshift-monitoring/node-exporter-zcpxh" Apr 21 14:58:30.370410 ip-10-0-129-133 kubenswrapper[2610]: E0421 14:58:30.370404 2610 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5b20462c-711e-48e3-bfc6-772375505353-openshift-state-metrics-tls podName:5b20462c-711e-48e3-bfc6-772375505353 nodeName:}" failed. No retries permitted until 2026-04-21 14:58:30.870382236 +0000 UTC m=+174.854274281 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "openshift-state-metrics-tls" (UniqueName: "kubernetes.io/secret/5b20462c-711e-48e3-bfc6-772375505353-openshift-state-metrics-tls") pod "openshift-state-metrics-9d44df66c-lmv67" (UID: "5b20462c-711e-48e3-bfc6-772375505353") : secret "openshift-state-metrics-tls" not found Apr 21 14:58:30.370895 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:58:30.370483 2610 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/6e950fdc-c8a9-4a4e-ac1e-c78d8747299f-node-exporter-wtmp\") pod \"node-exporter-zcpxh\" (UID: \"6e950fdc-c8a9-4a4e-ac1e-c78d8747299f\") " pod="openshift-monitoring/node-exporter-zcpxh" Apr 21 14:58:30.370895 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:58:30.370516 2610 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/6e950fdc-c8a9-4a4e-ac1e-c78d8747299f-root\") pod \"node-exporter-zcpxh\" (UID: \"6e950fdc-c8a9-4a4e-ac1e-c78d8747299f\") " pod="openshift-monitoring/node-exporter-zcpxh" Apr 21 14:58:30.370895 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:58:30.370568 2610 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/6e950fdc-c8a9-4a4e-ac1e-c78d8747299f-node-exporter-accelerators-collector-config\") pod \"node-exporter-zcpxh\" (UID: \"6e950fdc-c8a9-4a4e-ac1e-c78d8747299f\") " pod="openshift-monitoring/node-exporter-zcpxh" Apr 21 14:58:30.370895 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:58:30.370639 2610 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/6e950fdc-c8a9-4a4e-ac1e-c78d8747299f-node-exporter-textfile\") pod \"node-exporter-zcpxh\" (UID: \"6e950fdc-c8a9-4a4e-ac1e-c78d8747299f\") " pod="openshift-monitoring/node-exporter-zcpxh" Apr 21 14:58:30.370895 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:58:30.370682 2610 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8bf5j\" (UniqueName: \"kubernetes.io/projected/6e950fdc-c8a9-4a4e-ac1e-c78d8747299f-kube-api-access-8bf5j\") pod \"node-exporter-zcpxh\" (UID: \"6e950fdc-c8a9-4a4e-ac1e-c78d8747299f\") " pod="openshift-monitoring/node-exporter-zcpxh" Apr 21 14:58:30.370895 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:58:30.370729 2610 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/6e950fdc-c8a9-4a4e-ac1e-c78d8747299f-sys\") pod \"node-exporter-zcpxh\" (UID: \"6e950fdc-c8a9-4a4e-ac1e-c78d8747299f\") " pod="openshift-monitoring/node-exporter-zcpxh" Apr 21 14:58:30.370895 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:58:30.370766 2610 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/5b20462c-711e-48e3-bfc6-772375505353-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-9d44df66c-lmv67\" (UID: \"5b20462c-711e-48e3-bfc6-772375505353\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-lmv67" Apr 21 14:58:30.370895 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:58:30.370873 2610 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/6e950fdc-c8a9-4a4e-ac1e-c78d8747299f-sys\") pod \"node-exporter-zcpxh\" (UID: 
\"6e950fdc-c8a9-4a4e-ac1e-c78d8747299f\") " pod="openshift-monitoring/node-exporter-zcpxh" Apr 21 14:58:30.371286 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:58:30.370994 2610 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/6e950fdc-c8a9-4a4e-ac1e-c78d8747299f-metrics-client-ca\") pod \"node-exporter-zcpxh\" (UID: \"6e950fdc-c8a9-4a4e-ac1e-c78d8747299f\") " pod="openshift-monitoring/node-exporter-zcpxh" Apr 21 14:58:30.371286 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:58:30.371038 2610 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/5b20462c-711e-48e3-bfc6-772375505353-metrics-client-ca\") pod \"openshift-state-metrics-9d44df66c-lmv67\" (UID: \"5b20462c-711e-48e3-bfc6-772375505353\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-lmv67" Apr 21 14:58:30.371286 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:58:30.371054 2610 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"root\" (UniqueName: \"kubernetes.io/host-path/6e950fdc-c8a9-4a4e-ac1e-c78d8747299f-root\") pod \"node-exporter-zcpxh\" (UID: \"6e950fdc-c8a9-4a4e-ac1e-c78d8747299f\") " pod="openshift-monitoring/node-exporter-zcpxh" Apr 21 14:58:30.371286 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:58:30.371249 2610 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/6e950fdc-c8a9-4a4e-ac1e-c78d8747299f-node-exporter-accelerators-collector-config\") pod \"node-exporter-zcpxh\" (UID: \"6e950fdc-c8a9-4a4e-ac1e-c78d8747299f\") " pod="openshift-monitoring/node-exporter-zcpxh" Apr 21 14:58:30.371286 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:58:30.371278 2610 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/6e950fdc-c8a9-4a4e-ac1e-c78d8747299f-node-exporter-textfile\") pod \"node-exporter-zcpxh\" (UID: \"6e950fdc-c8a9-4a4e-ac1e-c78d8747299f\") " pod="openshift-monitoring/node-exporter-zcpxh" Apr 21 14:58:30.373070 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:58:30.373045 2610 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/6e950fdc-c8a9-4a4e-ac1e-c78d8747299f-node-exporter-tls\") pod \"node-exporter-zcpxh\" (UID: \"6e950fdc-c8a9-4a4e-ac1e-c78d8747299f\") " pod="openshift-monitoring/node-exporter-zcpxh" Apr 21 14:58:30.373300 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:58:30.373269 2610 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/5b20462c-711e-48e3-bfc6-772375505353-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-9d44df66c-lmv67\" (UID: \"5b20462c-711e-48e3-bfc6-772375505353\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-lmv67" Apr 21 14:58:30.373347 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:58:30.373269 2610 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/6e950fdc-c8a9-4a4e-ac1e-c78d8747299f-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-zcpxh\" (UID: \"6e950fdc-c8a9-4a4e-ac1e-c78d8747299f\") " pod="openshift-monitoring/node-exporter-zcpxh" Apr 21 14:58:30.381204 ip-10-0-129-133 kubenswrapper[2610]: I0421 
14:58:30.381181 2610 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-8bf5j\" (UniqueName: \"kubernetes.io/projected/6e950fdc-c8a9-4a4e-ac1e-c78d8747299f-kube-api-access-8bf5j\") pod \"node-exporter-zcpxh\" (UID: \"6e950fdc-c8a9-4a4e-ac1e-c78d8747299f\") " pod="openshift-monitoring/node-exporter-zcpxh" Apr 21 14:58:30.381371 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:58:30.381349 2610 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-dxjmd\" (UniqueName: \"kubernetes.io/projected/5b20462c-711e-48e3-bfc6-772375505353-kube-api-access-dxjmd\") pod \"openshift-state-metrics-9d44df66c-lmv67\" (UID: \"5b20462c-711e-48e3-bfc6-772375505353\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-lmv67" Apr 21 14:58:30.493592 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:58:30.493552 2610 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-zcpxh" Apr 21 14:58:30.501614 ip-10-0-129-133 kubenswrapper[2610]: W0421 14:58:30.501564 2610 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6e950fdc_c8a9_4a4e_ac1e_c78d8747299f.slice/crio-b69bdad89ff5fb1dcba5c7b158f2d4f3708162f632b6982e333de181a2842a6a WatchSource:0}: Error finding container b69bdad89ff5fb1dcba5c7b158f2d4f3708162f632b6982e333de181a2842a6a: Status 404 returned error can't find the container with id b69bdad89ff5fb1dcba5c7b158f2d4f3708162f632b6982e333de181a2842a6a Apr 21 14:58:30.874846 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:58:30.874813 2610 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/5b20462c-711e-48e3-bfc6-772375505353-openshift-state-metrics-tls\") pod \"openshift-state-metrics-9d44df66c-lmv67\" (UID: \"5b20462c-711e-48e3-bfc6-772375505353\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-lmv67" Apr 21 14:58:30.877452 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:58:30.877429 2610 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/5b20462c-711e-48e3-bfc6-772375505353-openshift-state-metrics-tls\") pod \"openshift-state-metrics-9d44df66c-lmv67\" (UID: \"5b20462c-711e-48e3-bfc6-772375505353\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-lmv67" Apr 21 14:58:31.074209 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:58:31.074181 2610 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-lmv67" Apr 21 14:58:31.217071 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:58:31.217026 2610 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/openshift-state-metrics-9d44df66c-lmv67"] Apr 21 14:58:31.265857 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:58:31.265815 2610 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-zcpxh" event={"ID":"6e950fdc-c8a9-4a4e-ac1e-c78d8747299f","Type":"ContainerStarted","Data":"b69bdad89ff5fb1dcba5c7b158f2d4f3708162f632b6982e333de181a2842a6a"} Apr 21 14:58:31.290038 ip-10-0-129-133 kubenswrapper[2610]: W0421 14:58:31.290007 2610 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5b20462c_711e_48e3_bfc6_772375505353.slice/crio-e13aa28ff57c71b9cb92f0e4c98fc867c907be9b7f1ca0f362f6c66bbb473996 WatchSource:0}: Error finding container e13aa28ff57c71b9cb92f0e4c98fc867c907be9b7f1ca0f362f6c66bbb473996: Status 404 returned error can't find the container with id e13aa28ff57c71b9cb92f0e4c98fc867c907be9b7f1ca0f362f6c66bbb473996 Apr 21 14:58:32.269518 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:58:32.269490 2610 generic.go:358] "Generic (PLEG): container finished" podID="6e950fdc-c8a9-4a4e-ac1e-c78d8747299f" containerID="30e76da7ad7b2920c7f5e81d68a15986133fb818ae1701a75c19aae1e769b95f" exitCode=0 Apr 21 14:58:32.269901 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:58:32.269595 2610 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-zcpxh" event={"ID":"6e950fdc-c8a9-4a4e-ac1e-c78d8747299f","Type":"ContainerDied","Data":"30e76da7ad7b2920c7f5e81d68a15986133fb818ae1701a75c19aae1e769b95f"} Apr 21 14:58:32.271320 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:58:32.271292 2610 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-lmv67" event={"ID":"5b20462c-711e-48e3-bfc6-772375505353","Type":"ContainerStarted","Data":"edeb03c2c8dc5a8f9d61e26e4d220fa6818a56d61600306b1a55d87528031c04"} Apr 21 14:58:32.271320 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:58:32.271318 2610 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-lmv67" event={"ID":"5b20462c-711e-48e3-bfc6-772375505353","Type":"ContainerStarted","Data":"3b890f8a343925882b38bfa9e16b06d9e479173141015081613f45665937d273"} Apr 21 14:58:32.271486 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:58:32.271330 2610 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-lmv67" event={"ID":"5b20462c-711e-48e3-bfc6-772375505353","Type":"ContainerStarted","Data":"e13aa28ff57c71b9cb92f0e4c98fc867c907be9b7f1ca0f362f6c66bbb473996"} Apr 21 14:58:33.276000 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:58:33.275955 2610 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-zcpxh" event={"ID":"6e950fdc-c8a9-4a4e-ac1e-c78d8747299f","Type":"ContainerStarted","Data":"08e6289ee91a4915d49d95bbf7645febee903156e6bf10437b5fafccad4fac9e"} Apr 21 14:58:33.276000 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:58:33.275997 2610 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-zcpxh" event={"ID":"6e950fdc-c8a9-4a4e-ac1e-c78d8747299f","Type":"ContainerStarted","Data":"44294d4c1b7a4e9db4f6ddaa7f03c47bd2e2fc623044e5b2c39f878e022436fb"} Apr 21 
14:58:33.277777 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:58:33.277752 2610 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-lmv67" event={"ID":"5b20462c-711e-48e3-bfc6-772375505353","Type":"ContainerStarted","Data":"dff82cd05684844a8062534705b100b4d1a799351785b55cf5f3598ccdfccf5d"} Apr 21 14:58:33.293158 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:58:33.293113 2610 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/node-exporter-zcpxh" podStartSLOduration=2.460698545 podStartE2EDuration="3.29310006s" podCreationTimestamp="2026-04-21 14:58:30 +0000 UTC" firstStartedPulling="2026-04-21 14:58:30.503272075 +0000 UTC m=+174.487164102" lastFinishedPulling="2026-04-21 14:58:31.335673579 +0000 UTC m=+175.319565617" observedRunningTime="2026-04-21 14:58:33.291527866 +0000 UTC m=+177.275419916" watchObservedRunningTime="2026-04-21 14:58:33.29310006 +0000 UTC m=+177.276992109" Apr 21 14:58:33.308311 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:58:33.308260 2610 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-lmv67" podStartSLOduration=2.461138873 podStartE2EDuration="3.308245042s" podCreationTimestamp="2026-04-21 14:58:30 +0000 UTC" firstStartedPulling="2026-04-21 14:58:31.410009007 +0000 UTC m=+175.393901032" lastFinishedPulling="2026-04-21 14:58:32.257115174 +0000 UTC m=+176.241007201" observedRunningTime="2026-04-21 14:58:33.307539945 +0000 UTC m=+177.291431994" watchObservedRunningTime="2026-04-21 14:58:33.308245042 +0000 UTC m=+177.292137090" Apr 21 14:58:34.493080 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:58:34.493044 2610 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/metrics-server-59d6dcb677-9wdsz"] Apr 21 14:58:34.496185 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:58:34.496169 2610 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/metrics-server-59d6dcb677-9wdsz" Apr 21 14:58:34.498542 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:58:34.498513 2610 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-c81ihjmta7d6\"" Apr 21 14:58:34.498676 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:58:34.498546 2610 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kubelet-serving-ca-bundle\"" Apr 21 14:58:34.498676 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:58:34.498600 2610 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-client-certs\"" Apr 21 14:58:34.498676 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:58:34.498625 2610 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-dockercfg-q6hfq\"" Apr 21 14:58:34.498676 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:58:34.498625 2610 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-server-audit-profiles\"" Apr 21 14:58:34.498930 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:58:34.498915 2610 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-tls\"" Apr 21 14:58:34.515088 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:58:34.515067 2610 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/metrics-server-59d6dcb677-9wdsz"] Apr 21 14:58:34.605302 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:58:34.605272 2610 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/10c7a117-9bb1-4269-bcc2-28891c48c4e1-secret-metrics-server-tls\") pod \"metrics-server-59d6dcb677-9wdsz\" (UID: \"10c7a117-9bb1-4269-bcc2-28891c48c4e1\") " pod="openshift-monitoring/metrics-server-59d6dcb677-9wdsz" Apr 21 14:58:34.605454 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:58:34.605317 2610 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/10c7a117-9bb1-4269-bcc2-28891c48c4e1-audit-log\") pod \"metrics-server-59d6dcb677-9wdsz\" (UID: \"10c7a117-9bb1-4269-bcc2-28891c48c4e1\") " pod="openshift-monitoring/metrics-server-59d6dcb677-9wdsz" Apr 21 14:58:34.605454 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:58:34.605338 2610 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ldcvt\" (UniqueName: \"kubernetes.io/projected/10c7a117-9bb1-4269-bcc2-28891c48c4e1-kube-api-access-ldcvt\") pod \"metrics-server-59d6dcb677-9wdsz\" (UID: \"10c7a117-9bb1-4269-bcc2-28891c48c4e1\") " pod="openshift-monitoring/metrics-server-59d6dcb677-9wdsz" Apr 21 14:58:34.605454 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:58:34.605406 2610 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/10c7a117-9bb1-4269-bcc2-28891c48c4e1-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-59d6dcb677-9wdsz\" (UID: \"10c7a117-9bb1-4269-bcc2-28891c48c4e1\") " pod="openshift-monitoring/metrics-server-59d6dcb677-9wdsz" Apr 21 14:58:34.605454 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:58:34.605440 2610 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/10c7a117-9bb1-4269-bcc2-28891c48c4e1-metrics-server-audit-profiles\") pod \"metrics-server-59d6dcb677-9wdsz\" (UID: \"10c7a117-9bb1-4269-bcc2-28891c48c4e1\") " pod="openshift-monitoring/metrics-server-59d6dcb677-9wdsz" Apr 21 14:58:34.605624 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:58:34.605463 2610 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-server-client-certs\" (UniqueName: \"kubernetes.io/secret/10c7a117-9bb1-4269-bcc2-28891c48c4e1-secret-metrics-server-client-certs\") pod \"metrics-server-59d6dcb677-9wdsz\" (UID: \"10c7a117-9bb1-4269-bcc2-28891c48c4e1\") " pod="openshift-monitoring/metrics-server-59d6dcb677-9wdsz" Apr 21 14:58:34.605624 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:58:34.605553 2610 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/10c7a117-9bb1-4269-bcc2-28891c48c4e1-client-ca-bundle\") pod \"metrics-server-59d6dcb677-9wdsz\" (UID: \"10c7a117-9bb1-4269-bcc2-28891c48c4e1\") " pod="openshift-monitoring/metrics-server-59d6dcb677-9wdsz" Apr 21 14:58:34.706254 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:58:34.706218 2610 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/10c7a117-9bb1-4269-bcc2-28891c48c4e1-secret-metrics-server-tls\") pod \"metrics-server-59d6dcb677-9wdsz\" (UID: \"10c7a117-9bb1-4269-bcc2-28891c48c4e1\") " pod="openshift-monitoring/metrics-server-59d6dcb677-9wdsz" Apr 21 14:58:34.706434 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:58:34.706262 2610 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/10c7a117-9bb1-4269-bcc2-28891c48c4e1-audit-log\") pod \"metrics-server-59d6dcb677-9wdsz\" (UID: \"10c7a117-9bb1-4269-bcc2-28891c48c4e1\") " pod="openshift-monitoring/metrics-server-59d6dcb677-9wdsz" Apr 21 14:58:34.706434 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:58:34.706282 2610 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ldcvt\" (UniqueName: \"kubernetes.io/projected/10c7a117-9bb1-4269-bcc2-28891c48c4e1-kube-api-access-ldcvt\") pod \"metrics-server-59d6dcb677-9wdsz\" (UID: \"10c7a117-9bb1-4269-bcc2-28891c48c4e1\") " pod="openshift-monitoring/metrics-server-59d6dcb677-9wdsz" Apr 21 14:58:34.706434 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:58:34.706305 2610 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/10c7a117-9bb1-4269-bcc2-28891c48c4e1-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-59d6dcb677-9wdsz\" (UID: \"10c7a117-9bb1-4269-bcc2-28891c48c4e1\") " pod="openshift-monitoring/metrics-server-59d6dcb677-9wdsz" Apr 21 14:58:34.706434 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:58:34.706323 2610 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/10c7a117-9bb1-4269-bcc2-28891c48c4e1-metrics-server-audit-profiles\") pod \"metrics-server-59d6dcb677-9wdsz\" (UID: \"10c7a117-9bb1-4269-bcc2-28891c48c4e1\") " pod="openshift-monitoring/metrics-server-59d6dcb677-9wdsz" Apr 21 14:58:34.706434 ip-10-0-129-133 
kubenswrapper[2610]: I0421 14:58:34.706348 2610 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-server-client-certs\" (UniqueName: \"kubernetes.io/secret/10c7a117-9bb1-4269-bcc2-28891c48c4e1-secret-metrics-server-client-certs\") pod \"metrics-server-59d6dcb677-9wdsz\" (UID: \"10c7a117-9bb1-4269-bcc2-28891c48c4e1\") " pod="openshift-monitoring/metrics-server-59d6dcb677-9wdsz" Apr 21 14:58:34.706434 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:58:34.706405 2610 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/10c7a117-9bb1-4269-bcc2-28891c48c4e1-client-ca-bundle\") pod \"metrics-server-59d6dcb677-9wdsz\" (UID: \"10c7a117-9bb1-4269-bcc2-28891c48c4e1\") " pod="openshift-monitoring/metrics-server-59d6dcb677-9wdsz" Apr 21 14:58:34.706751 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:58:34.706727 2610 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/10c7a117-9bb1-4269-bcc2-28891c48c4e1-audit-log\") pod \"metrics-server-59d6dcb677-9wdsz\" (UID: \"10c7a117-9bb1-4269-bcc2-28891c48c4e1\") " pod="openshift-monitoring/metrics-server-59d6dcb677-9wdsz" Apr 21 14:58:34.707409 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:58:34.707382 2610 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/10c7a117-9bb1-4269-bcc2-28891c48c4e1-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-59d6dcb677-9wdsz\" (UID: \"10c7a117-9bb1-4269-bcc2-28891c48c4e1\") " pod="openshift-monitoring/metrics-server-59d6dcb677-9wdsz" Apr 21 14:58:34.707524 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:58:34.707459 2610 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/10c7a117-9bb1-4269-bcc2-28891c48c4e1-metrics-server-audit-profiles\") pod \"metrics-server-59d6dcb677-9wdsz\" (UID: \"10c7a117-9bb1-4269-bcc2-28891c48c4e1\") " pod="openshift-monitoring/metrics-server-59d6dcb677-9wdsz" Apr 21 14:58:34.709082 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:58:34.709053 2610 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-server-client-certs\" (UniqueName: \"kubernetes.io/secret/10c7a117-9bb1-4269-bcc2-28891c48c4e1-secret-metrics-server-client-certs\") pod \"metrics-server-59d6dcb677-9wdsz\" (UID: \"10c7a117-9bb1-4269-bcc2-28891c48c4e1\") " pod="openshift-monitoring/metrics-server-59d6dcb677-9wdsz" Apr 21 14:58:34.709082 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:58:34.709062 2610 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/10c7a117-9bb1-4269-bcc2-28891c48c4e1-client-ca-bundle\") pod \"metrics-server-59d6dcb677-9wdsz\" (UID: \"10c7a117-9bb1-4269-bcc2-28891c48c4e1\") " pod="openshift-monitoring/metrics-server-59d6dcb677-9wdsz" Apr 21 14:58:34.709198 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:58:34.709097 2610 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/10c7a117-9bb1-4269-bcc2-28891c48c4e1-secret-metrics-server-tls\") pod \"metrics-server-59d6dcb677-9wdsz\" (UID: \"10c7a117-9bb1-4269-bcc2-28891c48c4e1\") " pod="openshift-monitoring/metrics-server-59d6dcb677-9wdsz" Apr 21 14:58:34.713884 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:58:34.713865 
2610 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-ldcvt\" (UniqueName: \"kubernetes.io/projected/10c7a117-9bb1-4269-bcc2-28891c48c4e1-kube-api-access-ldcvt\") pod \"metrics-server-59d6dcb677-9wdsz\" (UID: \"10c7a117-9bb1-4269-bcc2-28891c48c4e1\") " pod="openshift-monitoring/metrics-server-59d6dcb677-9wdsz" Apr 21 14:58:34.804874 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:58:34.804787 2610 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/metrics-server-59d6dcb677-9wdsz" Apr 21 14:58:34.950260 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:58:34.950231 2610 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/metrics-server-59d6dcb677-9wdsz"] Apr 21 14:58:34.953956 ip-10-0-129-133 kubenswrapper[2610]: W0421 14:58:34.953925 2610 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod10c7a117_9bb1_4269_bcc2_28891c48c4e1.slice/crio-b18174abb846e3e89a9a1ba4c8ceaad937ccdc66ad1dfee6330c03c4b035cd8a WatchSource:0}: Error finding container b18174abb846e3e89a9a1ba4c8ceaad937ccdc66ad1dfee6330c03c4b035cd8a: Status 404 returned error can't find the container with id b18174abb846e3e89a9a1ba4c8ceaad937ccdc66ad1dfee6330c03c4b035cd8a Apr 21 14:58:35.088619 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:58:35.088527 2610 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/monitoring-plugin-7dccd58f55-m7zgz"] Apr 21 14:58:35.092912 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:58:35.092895 2610 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-m7zgz" Apr 21 14:58:35.095206 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:58:35.095186 2610 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"monitoring-plugin-cert\"" Apr 21 14:58:35.095325 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:58:35.095214 2610 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"default-dockercfg-n8dmk\"" Apr 21 14:58:35.100677 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:58:35.100651 2610 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/monitoring-plugin-7dccd58f55-m7zgz"] Apr 21 14:58:35.209739 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:58:35.209708 2610 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/435ebbea-4b21-4fa3-883b-ef3b3e18723e-monitoring-plugin-cert\") pod \"monitoring-plugin-7dccd58f55-m7zgz\" (UID: \"435ebbea-4b21-4fa3-883b-ef3b3e18723e\") " pod="openshift-monitoring/monitoring-plugin-7dccd58f55-m7zgz" Apr 21 14:58:35.284463 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:58:35.284433 2610 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/metrics-server-59d6dcb677-9wdsz" event={"ID":"10c7a117-9bb1-4269-bcc2-28891c48c4e1","Type":"ContainerStarted","Data":"b18174abb846e3e89a9a1ba4c8ceaad937ccdc66ad1dfee6330c03c4b035cd8a"} Apr 21 14:58:35.311033 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:58:35.311005 2610 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/435ebbea-4b21-4fa3-883b-ef3b3e18723e-monitoring-plugin-cert\") pod \"monitoring-plugin-7dccd58f55-m7zgz\" (UID: 
\"435ebbea-4b21-4fa3-883b-ef3b3e18723e\") " pod="openshift-monitoring/monitoring-plugin-7dccd58f55-m7zgz" Apr 21 14:58:35.311130 ip-10-0-129-133 kubenswrapper[2610]: E0421 14:58:35.311113 2610 secret.go:189] Couldn't get secret openshift-monitoring/monitoring-plugin-cert: secret "monitoring-plugin-cert" not found Apr 21 14:58:35.311176 ip-10-0-129-133 kubenswrapper[2610]: E0421 14:58:35.311161 2610 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/435ebbea-4b21-4fa3-883b-ef3b3e18723e-monitoring-plugin-cert podName:435ebbea-4b21-4fa3-883b-ef3b3e18723e nodeName:}" failed. No retries permitted until 2026-04-21 14:58:35.811149337 +0000 UTC m=+179.795041363 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "monitoring-plugin-cert" (UniqueName: "kubernetes.io/secret/435ebbea-4b21-4fa3-883b-ef3b3e18723e-monitoring-plugin-cert") pod "monitoring-plugin-7dccd58f55-m7zgz" (UID: "435ebbea-4b21-4fa3-883b-ef3b3e18723e") : secret "monitoring-plugin-cert" not found Apr 21 14:58:35.815826 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:58:35.815783 2610 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/435ebbea-4b21-4fa3-883b-ef3b3e18723e-monitoring-plugin-cert\") pod \"monitoring-plugin-7dccd58f55-m7zgz\" (UID: \"435ebbea-4b21-4fa3-883b-ef3b3e18723e\") " pod="openshift-monitoring/monitoring-plugin-7dccd58f55-m7zgz" Apr 21 14:58:35.818733 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:58:35.818705 2610 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/435ebbea-4b21-4fa3-883b-ef3b3e18723e-monitoring-plugin-cert\") pod \"monitoring-plugin-7dccd58f55-m7zgz\" (UID: \"435ebbea-4b21-4fa3-883b-ef3b3e18723e\") " pod="openshift-monitoring/monitoring-plugin-7dccd58f55-m7zgz" Apr 21 14:58:36.002608 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:58:36.002541 2610 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-m7zgz" Apr 21 14:58:36.354362 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:58:36.354336 2610 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/monitoring-plugin-7dccd58f55-m7zgz"] Apr 21 14:58:36.359033 ip-10-0-129-133 kubenswrapper[2610]: W0421 14:58:36.359009 2610 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod435ebbea_4b21_4fa3_883b_ef3b3e18723e.slice/crio-f170362467f0ad4db29a1df787f6f07ad81516b876f72bfdb65d34cf16e9e18c WatchSource:0}: Error finding container f170362467f0ad4db29a1df787f6f07ad81516b876f72bfdb65d34cf16e9e18c: Status 404 returned error can't find the container with id f170362467f0ad4db29a1df787f6f07ad81516b876f72bfdb65d34cf16e9e18c Apr 21 14:58:36.509063 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:58:36.509031 2610 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 21 14:58:36.514607 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:58:36.514567 2610 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Apr 21 14:58:36.516846 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:58:36.516824 2610 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-rbac-proxy\"" Apr 21 14:58:36.516953 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:58:36.516843 2610 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-kube-rbac-proxy-web\"" Apr 21 14:58:36.516953 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:58:36.516937 2610 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s\"" Apr 21 14:58:36.517111 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:58:36.516985 2610 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-dockercfg-l5jw5\"" Apr 21 14:58:36.517111 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:58:36.517049 2610 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls-assets-0\"" Apr 21 14:58:36.517111 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:58:36.517103 2610 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-prometheus-http-client-file\"" Apr 21 14:58:36.517367 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:58:36.517350 2610 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls\"" Apr 21 14:58:36.517464 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:58:36.517353 2610 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-client-certs\"" Apr 21 14:58:36.517464 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:58:36.517449 2610 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-grpc-tls-f3ddpi19md28c\"" Apr 21 14:58:36.517617 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:58:36.517597 2610 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"serving-certs-ca-bundle\"" Apr 21 14:58:36.517680 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:58:36.517654 2610 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-sidecar-tls\"" Apr 21 14:58:36.519251 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:58:36.519232 2610 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-web-config\"" Apr 21 14:58:36.521328 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:58:36.521311 2610 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-rulefiles-0\"" Apr 21 14:58:36.523236 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:58:36.523216 2610 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"prometheus-trusted-ca-bundle\"" Apr 21 14:58:36.525791 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:58:36.525770 2610 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 21 14:58:36.622368 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:58:36.622339 2610 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/secret/484f360a-e0dd-419c-bfbb-758bbe78f68e-config\") pod \"prometheus-k8s-0\" (UID: \"484f360a-e0dd-419c-bfbb-758bbe78f68e\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 14:58:36.622535 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:58:36.622373 2610 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/484f360a-e0dd-419c-bfbb-758bbe78f68e-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"484f360a-e0dd-419c-bfbb-758bbe78f68e\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 14:58:36.622535 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:58:36.622400 2610 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/484f360a-e0dd-419c-bfbb-758bbe78f68e-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"484f360a-e0dd-419c-bfbb-758bbe78f68e\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 14:58:36.622535 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:58:36.622467 2610 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/484f360a-e0dd-419c-bfbb-758bbe78f68e-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"484f360a-e0dd-419c-bfbb-758bbe78f68e\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 14:58:36.622535 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:58:36.622503 2610 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rp7v5\" (UniqueName: \"kubernetes.io/projected/484f360a-e0dd-419c-bfbb-758bbe78f68e-kube-api-access-rp7v5\") pod \"prometheus-k8s-0\" (UID: \"484f360a-e0dd-419c-bfbb-758bbe78f68e\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 14:58:36.622780 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:58:36.622548 2610 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/484f360a-e0dd-419c-bfbb-758bbe78f68e-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"484f360a-e0dd-419c-bfbb-758bbe78f68e\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 14:58:36.622780 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:58:36.622647 2610 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/484f360a-e0dd-419c-bfbb-758bbe78f68e-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"484f360a-e0dd-419c-bfbb-758bbe78f68e\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 14:58:36.622780 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:58:36.622683 2610 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/484f360a-e0dd-419c-bfbb-758bbe78f68e-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"484f360a-e0dd-419c-bfbb-758bbe78f68e\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 14:58:36.622780 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:58:36.622708 2610 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/484f360a-e0dd-419c-bfbb-758bbe78f68e-tls-assets\") pod \"prometheus-k8s-0\" (UID: 
\"484f360a-e0dd-419c-bfbb-758bbe78f68e\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 14:58:36.622780 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:58:36.622739 2610 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/484f360a-e0dd-419c-bfbb-758bbe78f68e-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"484f360a-e0dd-419c-bfbb-758bbe78f68e\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 14:58:36.622780 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:58:36.622771 2610 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/484f360a-e0dd-419c-bfbb-758bbe78f68e-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"484f360a-e0dd-419c-bfbb-758bbe78f68e\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 14:58:36.623050 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:58:36.622791 2610 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/484f360a-e0dd-419c-bfbb-758bbe78f68e-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"484f360a-e0dd-419c-bfbb-758bbe78f68e\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 14:58:36.623050 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:58:36.622816 2610 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/484f360a-e0dd-419c-bfbb-758bbe78f68e-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"484f360a-e0dd-419c-bfbb-758bbe78f68e\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 14:58:36.623050 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:58:36.622836 2610 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/484f360a-e0dd-419c-bfbb-758bbe78f68e-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"484f360a-e0dd-419c-bfbb-758bbe78f68e\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 14:58:36.623050 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:58:36.622857 2610 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/484f360a-e0dd-419c-bfbb-758bbe78f68e-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"484f360a-e0dd-419c-bfbb-758bbe78f68e\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 14:58:36.623050 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:58:36.622875 2610 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/484f360a-e0dd-419c-bfbb-758bbe78f68e-config-out\") pod \"prometheus-k8s-0\" (UID: \"484f360a-e0dd-419c-bfbb-758bbe78f68e\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 14:58:36.623050 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:58:36.622889 2610 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/484f360a-e0dd-419c-bfbb-758bbe78f68e-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: 
\"484f360a-e0dd-419c-bfbb-758bbe78f68e\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 14:58:36.623050 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:58:36.622918 2610 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/484f360a-e0dd-419c-bfbb-758bbe78f68e-web-config\") pod \"prometheus-k8s-0\" (UID: \"484f360a-e0dd-419c-bfbb-758bbe78f68e\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 14:58:36.723836 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:58:36.723797 2610 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/484f360a-e0dd-419c-bfbb-758bbe78f68e-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"484f360a-e0dd-419c-bfbb-758bbe78f68e\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 14:58:36.723836 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:58:36.723834 2610 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rp7v5\" (UniqueName: \"kubernetes.io/projected/484f360a-e0dd-419c-bfbb-758bbe78f68e-kube-api-access-rp7v5\") pod \"prometheus-k8s-0\" (UID: \"484f360a-e0dd-419c-bfbb-758bbe78f68e\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 14:58:36.724048 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:58:36.723860 2610 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/484f360a-e0dd-419c-bfbb-758bbe78f68e-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"484f360a-e0dd-419c-bfbb-758bbe78f68e\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 14:58:36.724048 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:58:36.723919 2610 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/484f360a-e0dd-419c-bfbb-758bbe78f68e-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"484f360a-e0dd-419c-bfbb-758bbe78f68e\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 14:58:36.724048 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:58:36.723948 2610 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/484f360a-e0dd-419c-bfbb-758bbe78f68e-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"484f360a-e0dd-419c-bfbb-758bbe78f68e\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 14:58:36.724048 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:58:36.723973 2610 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/484f360a-e0dd-419c-bfbb-758bbe78f68e-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"484f360a-e0dd-419c-bfbb-758bbe78f68e\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 14:58:36.724048 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:58:36.724000 2610 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/484f360a-e0dd-419c-bfbb-758bbe78f68e-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"484f360a-e0dd-419c-bfbb-758bbe78f68e\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 14:58:36.724048 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:58:36.724030 2610 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/484f360a-e0dd-419c-bfbb-758bbe78f68e-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"484f360a-e0dd-419c-bfbb-758bbe78f68e\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 14:58:36.724368 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:58:36.724056 2610 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/484f360a-e0dd-419c-bfbb-758bbe78f68e-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"484f360a-e0dd-419c-bfbb-758bbe78f68e\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 14:58:36.724368 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:58:36.724087 2610 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/484f360a-e0dd-419c-bfbb-758bbe78f68e-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"484f360a-e0dd-419c-bfbb-758bbe78f68e\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 14:58:36.724368 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:58:36.724115 2610 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/484f360a-e0dd-419c-bfbb-758bbe78f68e-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"484f360a-e0dd-419c-bfbb-758bbe78f68e\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 14:58:36.724368 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:58:36.724142 2610 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/484f360a-e0dd-419c-bfbb-758bbe78f68e-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"484f360a-e0dd-419c-bfbb-758bbe78f68e\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 14:58:36.724368 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:58:36.724167 2610 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/484f360a-e0dd-419c-bfbb-758bbe78f68e-config-out\") pod \"prometheus-k8s-0\" (UID: \"484f360a-e0dd-419c-bfbb-758bbe78f68e\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 14:58:36.724368 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:58:36.724191 2610 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/484f360a-e0dd-419c-bfbb-758bbe78f68e-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"484f360a-e0dd-419c-bfbb-758bbe78f68e\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 14:58:36.724368 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:58:36.724231 2610 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/484f360a-e0dd-419c-bfbb-758bbe78f68e-web-config\") pod \"prometheus-k8s-0\" (UID: \"484f360a-e0dd-419c-bfbb-758bbe78f68e\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 14:58:36.724368 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:58:36.724260 2610 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/484f360a-e0dd-419c-bfbb-758bbe78f68e-config\") pod \"prometheus-k8s-0\" (UID: \"484f360a-e0dd-419c-bfbb-758bbe78f68e\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 
14:58:36.724368 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:58:36.724284 2610 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/484f360a-e0dd-419c-bfbb-758bbe78f68e-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"484f360a-e0dd-419c-bfbb-758bbe78f68e\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 14:58:36.724368 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:58:36.724309 2610 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/484f360a-e0dd-419c-bfbb-758bbe78f68e-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"484f360a-e0dd-419c-bfbb-758bbe78f68e\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 14:58:36.725207 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:58:36.724881 2610 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/484f360a-e0dd-419c-bfbb-758bbe78f68e-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"484f360a-e0dd-419c-bfbb-758bbe78f68e\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 14:58:36.725207 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:58:36.725035 2610 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/484f360a-e0dd-419c-bfbb-758bbe78f68e-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"484f360a-e0dd-419c-bfbb-758bbe78f68e\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 14:58:36.725207 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:58:36.725195 2610 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/484f360a-e0dd-419c-bfbb-758bbe78f68e-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"484f360a-e0dd-419c-bfbb-758bbe78f68e\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 14:58:36.728925 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:58:36.728903 2610 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/484f360a-e0dd-419c-bfbb-758bbe78f68e-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"484f360a-e0dd-419c-bfbb-758bbe78f68e\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 14:58:36.729254 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:58:36.729234 2610 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/484f360a-e0dd-419c-bfbb-758bbe78f68e-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"484f360a-e0dd-419c-bfbb-758bbe78f68e\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 14:58:36.729366 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:58:36.729317 2610 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/484f360a-e0dd-419c-bfbb-758bbe78f68e-config\") pod \"prometheus-k8s-0\" (UID: \"484f360a-e0dd-419c-bfbb-758bbe78f68e\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 14:58:36.729954 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:58:36.729634 2610 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/484f360a-e0dd-419c-bfbb-758bbe78f68e-secret-prometheus-k8s-tls\") pod 
\"prometheus-k8s-0\" (UID: \"484f360a-e0dd-419c-bfbb-758bbe78f68e\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 14:58:36.729954 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:58:36.729842 2610 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/484f360a-e0dd-419c-bfbb-758bbe78f68e-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"484f360a-e0dd-419c-bfbb-758bbe78f68e\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 14:58:36.730175 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:58:36.730155 2610 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/484f360a-e0dd-419c-bfbb-758bbe78f68e-web-config\") pod \"prometheus-k8s-0\" (UID: \"484f360a-e0dd-419c-bfbb-758bbe78f68e\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 14:58:36.730247 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:58:36.730191 2610 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/484f360a-e0dd-419c-bfbb-758bbe78f68e-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"484f360a-e0dd-419c-bfbb-758bbe78f68e\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 14:58:36.730343 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:58:36.730318 2610 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/484f360a-e0dd-419c-bfbb-758bbe78f68e-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"484f360a-e0dd-419c-bfbb-758bbe78f68e\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 14:58:36.730681 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:58:36.730650 2610 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/484f360a-e0dd-419c-bfbb-758bbe78f68e-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"484f360a-e0dd-419c-bfbb-758bbe78f68e\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 14:58:36.730922 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:58:36.730893 2610 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/484f360a-e0dd-419c-bfbb-758bbe78f68e-config-out\") pod \"prometheus-k8s-0\" (UID: \"484f360a-e0dd-419c-bfbb-758bbe78f68e\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 14:58:36.731001 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:58:36.730979 2610 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/484f360a-e0dd-419c-bfbb-758bbe78f68e-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"484f360a-e0dd-419c-bfbb-758bbe78f68e\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 14:58:36.731162 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:58:36.731139 2610 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/484f360a-e0dd-419c-bfbb-758bbe78f68e-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"484f360a-e0dd-419c-bfbb-758bbe78f68e\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 14:58:36.731433 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:58:36.731416 2610 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: 
\"kubernetes.io/secret/484f360a-e0dd-419c-bfbb-758bbe78f68e-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"484f360a-e0dd-419c-bfbb-758bbe78f68e\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 14:58:36.731590 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:58:36.731561 2610 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/484f360a-e0dd-419c-bfbb-758bbe78f68e-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"484f360a-e0dd-419c-bfbb-758bbe78f68e\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 14:58:36.732327 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:58:36.732310 2610 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-rp7v5\" (UniqueName: \"kubernetes.io/projected/484f360a-e0dd-419c-bfbb-758bbe78f68e-kube-api-access-rp7v5\") pod \"prometheus-k8s-0\" (UID: \"484f360a-e0dd-419c-bfbb-758bbe78f68e\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 14:58:36.828190 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:58:36.827696 2610 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Apr 21 14:58:36.978543 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:58:36.978510 2610 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 21 14:58:36.981181 ip-10-0-129-133 kubenswrapper[2610]: W0421 14:58:36.981159 2610 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod484f360a_e0dd_419c_bfbb_758bbe78f68e.slice/crio-761992a80599b99706106b8c138e17718233c3b804c2810bc4429cd544580513 WatchSource:0}: Error finding container 761992a80599b99706106b8c138e17718233c3b804c2810bc4429cd544580513: Status 404 returned error can't find the container with id 761992a80599b99706106b8c138e17718233c3b804c2810bc4429cd544580513 Apr 21 14:58:37.292093 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:58:37.292053 2610 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/metrics-server-59d6dcb677-9wdsz" event={"ID":"10c7a117-9bb1-4269-bcc2-28891c48c4e1","Type":"ContainerStarted","Data":"bfeefdf26dc78b55ea11ee6f3e9f966b3408b22aafd6f785eebdb4a6734389f8"} Apr 21 14:58:37.293210 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:58:37.293180 2610 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"484f360a-e0dd-419c-bfbb-758bbe78f68e","Type":"ContainerStarted","Data":"761992a80599b99706106b8c138e17718233c3b804c2810bc4429cd544580513"} Apr 21 14:58:37.294233 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:58:37.294212 2610 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-m7zgz" event={"ID":"435ebbea-4b21-4fa3-883b-ef3b3e18723e","Type":"ContainerStarted","Data":"f170362467f0ad4db29a1df787f6f07ad81516b876f72bfdb65d34cf16e9e18c"} Apr 21 14:58:37.308991 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:58:37.308938 2610 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/metrics-server-59d6dcb677-9wdsz" podStartSLOduration=1.9803392039999999 podStartE2EDuration="3.308925669s" podCreationTimestamp="2026-04-21 14:58:34 +0000 UTC" firstStartedPulling="2026-04-21 14:58:34.955853867 +0000 UTC m=+178.939745892" lastFinishedPulling="2026-04-21 14:58:36.284440316 +0000 UTC m=+180.268332357" 
observedRunningTime="2026-04-21 14:58:37.308751315 +0000 UTC m=+181.292643364" watchObservedRunningTime="2026-04-21 14:58:37.308925669 +0000 UTC m=+181.292817717" Apr 21 14:58:37.621634 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:58:37.621535 2610 scope.go:117] "RemoveContainer" containerID="3b66ad60f2cf263007cef300f868f5d7edaa69c093f970d1c0ffa446e9f49d6a" Apr 21 14:58:38.017202 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:58:38.017162 2610 patch_prober.go:28] interesting pod/image-registry-67b4567bdb-m7dt5 container/registry namespace/openshift-image-registry: Liveness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body={"errors":[{"code":"UNAVAILABLE","message":"service unavailable","detail":"health check failed: please see /debug/health"}]} Apr 21 14:58:38.017493 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:58:38.017227 2610 prober.go:120] "Probe failed" probeType="Liveness" pod="openshift-image-registry/image-registry-67b4567bdb-m7dt5" podUID="5ac89af4-5925-4a52-a694-31a92b841ed6" containerName="registry" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 21 14:58:38.299188 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:58:38.299093 2610 generic.go:358] "Generic (PLEG): container finished" podID="484f360a-e0dd-419c-bfbb-758bbe78f68e" containerID="5f7a8e1571e7f70440c3efc11d3b5c7516e6817ce74c7e325b41c5620c66bd8e" exitCode=0 Apr 21 14:58:38.299343 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:58:38.299183 2610 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"484f360a-e0dd-419c-bfbb-758bbe78f68e","Type":"ContainerDied","Data":"5f7a8e1571e7f70440c3efc11d3b5c7516e6817ce74c7e325b41c5620c66bd8e"} Apr 21 14:58:38.300971 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:58:38.300954 2610 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-2k25j_d982beb0-1451-48ab-b61a-060b6d23cfc7/console-operator/2.log" Apr 21 14:58:38.301075 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:58:38.301058 2610 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-2k25j" event={"ID":"d982beb0-1451-48ab-b61a-060b6d23cfc7","Type":"ContainerStarted","Data":"9b95e5ae7ba17d967b642369ba865ff19ae643ff4a760e6c2bd1421a81eba137"} Apr 21 14:58:38.301369 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:58:38.301345 2610 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console-operator/console-operator-9d4b6777b-2k25j" Apr 21 14:58:38.302691 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:58:38.302623 2610 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-m7zgz" event={"ID":"435ebbea-4b21-4fa3-883b-ef3b3e18723e","Type":"ContainerStarted","Data":"512f72161edf65aa9fa41ee999176e0c5a6aec61192e9f5352a412c09306585a"} Apr 21 14:58:38.303134 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:58:38.303005 2610 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-m7zgz" Apr 21 14:58:38.307548 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:58:38.307531 2610 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-m7zgz" Apr 21 14:58:38.347221 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:58:38.342761 2610 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-console-operator/console-operator-9d4b6777b-2k25j" podStartSLOduration=43.306878094 podStartE2EDuration="45.342736677s" podCreationTimestamp="2026-04-21 14:57:53 +0000 UTC" firstStartedPulling="2026-04-21 14:57:54.264034207 +0000 UTC m=+138.247926234" lastFinishedPulling="2026-04-21 14:57:56.299892777 +0000 UTC m=+140.283784817" observedRunningTime="2026-04-21 14:58:38.34244096 +0000 UTC m=+182.326333009" watchObservedRunningTime="2026-04-21 14:58:38.342736677 +0000 UTC m=+182.326628726" Apr 21 14:58:38.360112 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:58:38.360059 2610 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-m7zgz" podStartSLOduration=1.650314595 podStartE2EDuration="3.360044843s" podCreationTimestamp="2026-04-21 14:58:35 +0000 UTC" firstStartedPulling="2026-04-21 14:58:36.361281717 +0000 UTC m=+180.345173751" lastFinishedPulling="2026-04-21 14:58:38.071011968 +0000 UTC m=+182.054903999" observedRunningTime="2026-04-21 14:58:38.358479588 +0000 UTC m=+182.342371636" watchObservedRunningTime="2026-04-21 14:58:38.360044843 +0000 UTC m=+182.343936892" Apr 21 14:58:38.826975 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:58:38.826945 2610 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console-operator/console-operator-9d4b6777b-2k25j" Apr 21 14:58:40.268147 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:58:40.268112 2610 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-mhp6p" Apr 21 14:58:41.241941 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:58:41.241902 2610 patch_prober.go:28] interesting pod/image-registry-67b4567bdb-m7dt5 container/registry namespace/openshift-image-registry: Readiness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body={"errors":[{"code":"UNAVAILABLE","message":"service unavailable","detail":"health check failed: please see /debug/health"}]} Apr 21 14:58:41.242124 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:58:41.241959 2610 prober.go:120] "Probe failed" probeType="Readiness" pod="openshift-image-registry/image-registry-67b4567bdb-m7dt5" podUID="5ac89af4-5925-4a52-a694-31a92b841ed6" containerName="registry" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 21 14:58:42.315175 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:58:42.315145 2610 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"484f360a-e0dd-419c-bfbb-758bbe78f68e","Type":"ContainerStarted","Data":"7276a5aa3611da2f32b7f6f5915d895419dd171f39b84777f9bde593efcc05ee"} Apr 21 14:58:42.315175 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:58:42.315176 2610 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"484f360a-e0dd-419c-bfbb-758bbe78f68e","Type":"ContainerStarted","Data":"40bfa1c660a1fa9bcb4cb608bb14b1d92f3519d960935791f39f62650aecb286"} Apr 21 14:58:43.268362 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:58:43.268330 2610 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-67b4567bdb-m7dt5"] Apr 21 14:58:43.273990 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:58:43.273945 2610 patch_prober.go:28] interesting pod/image-registry-67b4567bdb-m7dt5 container/registry namespace/openshift-image-registry: Readiness probe status=failure output="HTTP probe failed with statuscode: 503" 
start-of-body={"errors":[{"code":"UNAVAILABLE","message":"service unavailable","detail":"health check failed: please see /debug/health"}]} Apr 21 14:58:43.274135 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:58:43.274013 2610 prober.go:120] "Probe failed" probeType="Readiness" pod="openshift-image-registry/image-registry-67b4567bdb-m7dt5" podUID="5ac89af4-5925-4a52-a694-31a92b841ed6" containerName="registry" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 21 14:58:44.323102 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:58:44.323061 2610 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"484f360a-e0dd-419c-bfbb-758bbe78f68e","Type":"ContainerStarted","Data":"586d647ea635c451721398e97383e469831f6b8152c6640e90734512b9ba7ef5"} Apr 21 14:58:44.323102 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:58:44.323106 2610 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"484f360a-e0dd-419c-bfbb-758bbe78f68e","Type":"ContainerStarted","Data":"c0943ec802acf092ca21b535a44948aa28bf4432e54fbbc1ed424a2cb479ae80"} Apr 21 14:58:44.323495 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:58:44.323122 2610 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"484f360a-e0dd-419c-bfbb-758bbe78f68e","Type":"ContainerStarted","Data":"bf6740b3996600a554ff56d494c3f92cd3caf842a144846ebbfd1688d6491ed3"} Apr 21 14:58:44.323495 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:58:44.323135 2610 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"484f360a-e0dd-419c-bfbb-758bbe78f68e","Type":"ContainerStarted","Data":"7f2448c70f07e1d6a50bbce602167fdc17c14866b86929124e98b59718ba012e"} Apr 21 14:58:44.352252 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:58:44.352205 2610 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-k8s-0" podStartSLOduration=1.457127512 podStartE2EDuration="8.352193941s" podCreationTimestamp="2026-04-21 14:58:36 +0000 UTC" firstStartedPulling="2026-04-21 14:58:36.983209749 +0000 UTC m=+180.967101775" lastFinishedPulling="2026-04-21 14:58:43.878276174 +0000 UTC m=+187.862168204" observedRunningTime="2026-04-21 14:58:44.350830532 +0000 UTC m=+188.334722580" watchObservedRunningTime="2026-04-21 14:58:44.352193941 +0000 UTC m=+188.336085966" Apr 21 14:58:46.828824 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:58:46.828784 2610 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/prometheus-k8s-0" Apr 21 14:58:52.346310 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:58:52.346273 2610 generic.go:358] "Generic (PLEG): container finished" podID="e3d66dda-7e69-48a9-a23b-ca9cdad31f2b" containerID="c2167a2d5db0f95ca0813b845cc364107b930f190179165ec93e18e426f647bb" exitCode=0 Apr 21 14:58:52.346746 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:58:52.346333 2610 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-585dfdc468-fnrh5" event={"ID":"e3d66dda-7e69-48a9-a23b-ca9cdad31f2b","Type":"ContainerDied","Data":"c2167a2d5db0f95ca0813b845cc364107b930f190179165ec93e18e426f647bb"} Apr 21 14:58:52.346746 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:58:52.346681 2610 scope.go:117] "RemoveContainer" containerID="c2167a2d5db0f95ca0813b845cc364107b930f190179165ec93e18e426f647bb" Apr 21 14:58:53.273545 ip-10-0-129-133 kubenswrapper[2610]: I0421 
14:58:53.273517 2610 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-67b4567bdb-m7dt5" Apr 21 14:58:53.351245 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:58:53.351213 2610 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-585dfdc468-fnrh5" event={"ID":"e3d66dda-7e69-48a9-a23b-ca9cdad31f2b","Type":"ContainerStarted","Data":"fef779519379f3de2b840d5691ddd76fa3d1bc0f5531f7fca95dfc7d44ee4628"} Apr 21 14:58:53.453516 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:58:53.453487 2610 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-mhp6p_17ba6101-b1f6-412d-b361-2276f610226b/dns/0.log" Apr 21 14:58:53.652366 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:58:53.652290 2610 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-mhp6p_17ba6101-b1f6-412d-b361-2276f610226b/kube-rbac-proxy/0.log" Apr 21 14:58:54.805837 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:58:54.805786 2610 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-monitoring/metrics-server-59d6dcb677-9wdsz" Apr 21 14:58:54.805837 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:58:54.805822 2610 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/metrics-server-59d6dcb677-9wdsz" Apr 21 14:58:55.051810 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:58:55.051783 2610 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-n8qc9_b48f3832-4ecd-46ba-bde8-35a4180bf3ca/dns-node-resolver/0.log" Apr 21 14:58:55.252560 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:58:55.252533 2610 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_router-default-76d7d6f776-nvlj4_e22e5723-18d9-4194-867b-028f5e78e14d/router/0.log" Apr 21 14:58:55.452809 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:58:55.452782 2610 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-6vmnw_40ebea87-6126-42fa-bcf7-027f7fbce419/serve-healthcheck-canary/0.log" Apr 21 14:59:08.290812 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:59:08.290745 2610 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-image-registry/image-registry-67b4567bdb-m7dt5" podUID="5ac89af4-5925-4a52-a694-31a92b841ed6" containerName="registry" containerID="cri-o://db83ee8f487fef3f2a7414f02660c0e125dce304e29e799dc0512666f9d0c4ed" gracePeriod=30 Apr 21 14:59:08.538661 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:59:08.538639 2610 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-67b4567bdb-m7dt5" Apr 21 14:59:08.715660 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:59:08.715530 2610 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/5ac89af4-5925-4a52-a694-31a92b841ed6-ca-trust-extracted\") pod \"5ac89af4-5925-4a52-a694-31a92b841ed6\" (UID: \"5ac89af4-5925-4a52-a694-31a92b841ed6\") " Apr 21 14:59:08.715660 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:59:08.715620 2610 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/5ac89af4-5925-4a52-a694-31a92b841ed6-bound-sa-token\") pod \"5ac89af4-5925-4a52-a694-31a92b841ed6\" (UID: \"5ac89af4-5925-4a52-a694-31a92b841ed6\") " Apr 21 14:59:08.715880 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:59:08.715682 2610 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/5ac89af4-5925-4a52-a694-31a92b841ed6-registry-certificates\") pod \"5ac89af4-5925-4a52-a694-31a92b841ed6\" (UID: \"5ac89af4-5925-4a52-a694-31a92b841ed6\") " Apr 21 14:59:08.715880 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:59:08.715708 2610 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v4hvr\" (UniqueName: \"kubernetes.io/projected/5ac89af4-5925-4a52-a694-31a92b841ed6-kube-api-access-v4hvr\") pod \"5ac89af4-5925-4a52-a694-31a92b841ed6\" (UID: \"5ac89af4-5925-4a52-a694-31a92b841ed6\") " Apr 21 14:59:08.715880 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:59:08.715741 2610 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/5ac89af4-5925-4a52-a694-31a92b841ed6-trusted-ca\") pod \"5ac89af4-5925-4a52-a694-31a92b841ed6\" (UID: \"5ac89af4-5925-4a52-a694-31a92b841ed6\") " Apr 21 14:59:08.715880 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:59:08.715774 2610 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/5ac89af4-5925-4a52-a694-31a92b841ed6-installation-pull-secrets\") pod \"5ac89af4-5925-4a52-a694-31a92b841ed6\" (UID: \"5ac89af4-5925-4a52-a694-31a92b841ed6\") " Apr 21 14:59:08.715880 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:59:08.715824 2610 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/5ac89af4-5925-4a52-a694-31a92b841ed6-image-registry-private-configuration\") pod \"5ac89af4-5925-4a52-a694-31a92b841ed6\" (UID: \"5ac89af4-5925-4a52-a694-31a92b841ed6\") " Apr 21 14:59:08.715880 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:59:08.715850 2610 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/5ac89af4-5925-4a52-a694-31a92b841ed6-registry-tls\") pod \"5ac89af4-5925-4a52-a694-31a92b841ed6\" (UID: \"5ac89af4-5925-4a52-a694-31a92b841ed6\") " Apr 21 14:59:08.716347 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:59:08.716265 2610 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5ac89af4-5925-4a52-a694-31a92b841ed6-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "5ac89af4-5925-4a52-a694-31a92b841ed6" (UID: "5ac89af4-5925-4a52-a694-31a92b841ed6"). InnerVolumeSpecName "trusted-ca". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 21 14:59:08.716347 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:59:08.716273 2610 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5ac89af4-5925-4a52-a694-31a92b841ed6-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "5ac89af4-5925-4a52-a694-31a92b841ed6" (UID: "5ac89af4-5925-4a52-a694-31a92b841ed6"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 21 14:59:08.718416 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:59:08.718382 2610 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5ac89af4-5925-4a52-a694-31a92b841ed6-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "5ac89af4-5925-4a52-a694-31a92b841ed6" (UID: "5ac89af4-5925-4a52-a694-31a92b841ed6"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 21 14:59:08.718523 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:59:08.718417 2610 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5ac89af4-5925-4a52-a694-31a92b841ed6-kube-api-access-v4hvr" (OuterVolumeSpecName: "kube-api-access-v4hvr") pod "5ac89af4-5925-4a52-a694-31a92b841ed6" (UID: "5ac89af4-5925-4a52-a694-31a92b841ed6"). InnerVolumeSpecName "kube-api-access-v4hvr". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 21 14:59:08.718655 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:59:08.718568 2610 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5ac89af4-5925-4a52-a694-31a92b841ed6-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "5ac89af4-5925-4a52-a694-31a92b841ed6" (UID: "5ac89af4-5925-4a52-a694-31a92b841ed6"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 21 14:59:08.718891 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:59:08.718864 2610 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5ac89af4-5925-4a52-a694-31a92b841ed6-image-registry-private-configuration" (OuterVolumeSpecName: "image-registry-private-configuration") pod "5ac89af4-5925-4a52-a694-31a92b841ed6" (UID: "5ac89af4-5925-4a52-a694-31a92b841ed6"). InnerVolumeSpecName "image-registry-private-configuration". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 21 14:59:08.718891 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:59:08.718877 2610 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5ac89af4-5925-4a52-a694-31a92b841ed6-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "5ac89af4-5925-4a52-a694-31a92b841ed6" (UID: "5ac89af4-5925-4a52-a694-31a92b841ed6"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 21 14:59:08.724613 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:59:08.724562 2610 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5ac89af4-5925-4a52-a694-31a92b841ed6-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "5ac89af4-5925-4a52-a694-31a92b841ed6" (UID: "5ac89af4-5925-4a52-a694-31a92b841ed6"). InnerVolumeSpecName "ca-trust-extracted". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 21 14:59:08.817071 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:59:08.817035 2610 reconciler_common.go:299] "Volume detached for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/5ac89af4-5925-4a52-a694-31a92b841ed6-image-registry-private-configuration\") on node \"ip-10-0-129-133.ec2.internal\" DevicePath \"\"" Apr 21 14:59:08.817071 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:59:08.817067 2610 reconciler_common.go:299] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/5ac89af4-5925-4a52-a694-31a92b841ed6-registry-tls\") on node \"ip-10-0-129-133.ec2.internal\" DevicePath \"\"" Apr 21 14:59:08.817071 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:59:08.817078 2610 reconciler_common.go:299] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/5ac89af4-5925-4a52-a694-31a92b841ed6-ca-trust-extracted\") on node \"ip-10-0-129-133.ec2.internal\" DevicePath \"\"" Apr 21 14:59:08.817283 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:59:08.817087 2610 reconciler_common.go:299] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/5ac89af4-5925-4a52-a694-31a92b841ed6-bound-sa-token\") on node \"ip-10-0-129-133.ec2.internal\" DevicePath \"\"" Apr 21 14:59:08.817283 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:59:08.817096 2610 reconciler_common.go:299] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/5ac89af4-5925-4a52-a694-31a92b841ed6-registry-certificates\") on node \"ip-10-0-129-133.ec2.internal\" DevicePath \"\"" Apr 21 14:59:08.817283 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:59:08.817105 2610 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-v4hvr\" (UniqueName: \"kubernetes.io/projected/5ac89af4-5925-4a52-a694-31a92b841ed6-kube-api-access-v4hvr\") on node \"ip-10-0-129-133.ec2.internal\" DevicePath \"\"" Apr 21 14:59:08.817283 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:59:08.817113 2610 reconciler_common.go:299] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/5ac89af4-5925-4a52-a694-31a92b841ed6-trusted-ca\") on node \"ip-10-0-129-133.ec2.internal\" DevicePath \"\"" Apr 21 14:59:08.817283 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:59:08.817123 2610 reconciler_common.go:299] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/5ac89af4-5925-4a52-a694-31a92b841ed6-installation-pull-secrets\") on node \"ip-10-0-129-133.ec2.internal\" DevicePath \"\"" Apr 21 14:59:09.397193 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:59:09.397152 2610 generic.go:358] "Generic (PLEG): container finished" podID="5ac89af4-5925-4a52-a694-31a92b841ed6" containerID="db83ee8f487fef3f2a7414f02660c0e125dce304e29e799dc0512666f9d0c4ed" exitCode=0 Apr 21 14:59:09.397607 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:59:09.397218 2610 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-67b4567bdb-m7dt5" Apr 21 14:59:09.397607 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:59:09.397219 2610 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-67b4567bdb-m7dt5" event={"ID":"5ac89af4-5925-4a52-a694-31a92b841ed6","Type":"ContainerDied","Data":"db83ee8f487fef3f2a7414f02660c0e125dce304e29e799dc0512666f9d0c4ed"} Apr 21 14:59:09.397607 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:59:09.397254 2610 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-67b4567bdb-m7dt5" event={"ID":"5ac89af4-5925-4a52-a694-31a92b841ed6","Type":"ContainerDied","Data":"25ac7db99e1e7457666435b9cca9cfdb8d578cceeafcf33a6f8810022aa40715"} Apr 21 14:59:09.397607 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:59:09.397272 2610 scope.go:117] "RemoveContainer" containerID="db83ee8f487fef3f2a7414f02660c0e125dce304e29e799dc0512666f9d0c4ed" Apr 21 14:59:09.407709 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:59:09.407682 2610 scope.go:117] "RemoveContainer" containerID="db83ee8f487fef3f2a7414f02660c0e125dce304e29e799dc0512666f9d0c4ed" Apr 21 14:59:09.407997 ip-10-0-129-133 kubenswrapper[2610]: E0421 14:59:09.407976 2610 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"db83ee8f487fef3f2a7414f02660c0e125dce304e29e799dc0512666f9d0c4ed\": container with ID starting with db83ee8f487fef3f2a7414f02660c0e125dce304e29e799dc0512666f9d0c4ed not found: ID does not exist" containerID="db83ee8f487fef3f2a7414f02660c0e125dce304e29e799dc0512666f9d0c4ed" Apr 21 14:59:09.408049 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:59:09.408006 2610 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"db83ee8f487fef3f2a7414f02660c0e125dce304e29e799dc0512666f9d0c4ed"} err="failed to get container status \"db83ee8f487fef3f2a7414f02660c0e125dce304e29e799dc0512666f9d0c4ed\": rpc error: code = NotFound desc = could not find container \"db83ee8f487fef3f2a7414f02660c0e125dce304e29e799dc0512666f9d0c4ed\": container with ID starting with db83ee8f487fef3f2a7414f02660c0e125dce304e29e799dc0512666f9d0c4ed not found: ID does not exist" Apr 21 14:59:09.445963 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:59:09.445930 2610 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-67b4567bdb-m7dt5"] Apr 21 14:59:09.455531 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:59:09.455502 2610 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-67b4567bdb-m7dt5"] Apr 21 14:59:10.624828 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:59:10.624783 2610 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5ac89af4-5925-4a52-a694-31a92b841ed6" path="/var/lib/kubelet/pods/5ac89af4-5925-4a52-a694-31a92b841ed6/volumes" Apr 21 14:59:14.810215 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:59:14.810184 2610 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-monitoring/metrics-server-59d6dcb677-9wdsz" Apr 21 14:59:14.814193 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:59:14.814168 2610 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/metrics-server-59d6dcb677-9wdsz" Apr 21 14:59:36.829291 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:59:36.829198 2610 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" 
pod="openshift-monitoring/prometheus-k8s-0" Apr 21 14:59:36.845321 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:59:36.845293 2610 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-monitoring/prometheus-k8s-0" Apr 21 14:59:37.490904 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:59:37.490874 2610 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/prometheus-k8s-0" Apr 21 14:59:47.427269 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:59:47.427234 2610 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/9b064625-50f7-4c6a-be44-9aed34a00b26-metrics-certs\") pod \"network-metrics-daemon-mtdkf\" (UID: \"9b064625-50f7-4c6a-be44-9aed34a00b26\") " pod="openshift-multus/network-metrics-daemon-mtdkf" Apr 21 14:59:47.429670 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:59:47.429639 2610 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/9b064625-50f7-4c6a-be44-9aed34a00b26-metrics-certs\") pod \"network-metrics-daemon-mtdkf\" (UID: \"9b064625-50f7-4c6a-be44-9aed34a00b26\") " pod="openshift-multus/network-metrics-daemon-mtdkf" Apr 21 14:59:47.524310 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:59:47.524282 2610 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-cb4hj\"" Apr 21 14:59:47.533019 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:59:47.532997 2610 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-mtdkf" Apr 21 14:59:47.658157 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:59:47.658126 2610 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-mtdkf"] Apr 21 14:59:47.661404 ip-10-0-129-133 kubenswrapper[2610]: W0421 14:59:47.661377 2610 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9b064625_50f7_4c6a_be44_9aed34a00b26.slice/crio-661eb072c9b6e57db0e90c576baa5b4c81d6334469a5fe067ce8b57954cfa2fc WatchSource:0}: Error finding container 661eb072c9b6e57db0e90c576baa5b4c81d6334469a5fe067ce8b57954cfa2fc: Status 404 returned error can't find the container with id 661eb072c9b6e57db0e90c576baa5b4c81d6334469a5fe067ce8b57954cfa2fc Apr 21 14:59:48.507064 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:59:48.507020 2610 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-mtdkf" event={"ID":"9b064625-50f7-4c6a-be44-9aed34a00b26","Type":"ContainerStarted","Data":"661eb072c9b6e57db0e90c576baa5b4c81d6334469a5fe067ce8b57954cfa2fc"} Apr 21 14:59:49.511286 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:59:49.511251 2610 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-mtdkf" event={"ID":"9b064625-50f7-4c6a-be44-9aed34a00b26","Type":"ContainerStarted","Data":"7af215d3b1a38628a01ecc92b848765127891a9ed25aec44ef0af6d9f06a0464"} Apr 21 14:59:49.511286 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:59:49.511290 2610 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-mtdkf" event={"ID":"9b064625-50f7-4c6a-be44-9aed34a00b26","Type":"ContainerStarted","Data":"08448aebf9ec1b5922d6ec4cd06f68df16ef7255113d3bd5ae9f289018629ff3"} Apr 21 14:59:49.528661 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:59:49.528605 2610 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-mtdkf" podStartSLOduration=252.622462653 podStartE2EDuration="4m13.528589083s" podCreationTimestamp="2026-04-21 14:55:36 +0000 UTC" firstStartedPulling="2026-04-21 14:59:47.663566922 +0000 UTC m=+251.647458948" lastFinishedPulling="2026-04-21 14:59:48.569693347 +0000 UTC m=+252.553585378" observedRunningTime="2026-04-21 14:59:49.528378225 +0000 UTC m=+253.512270274" watchObservedRunningTime="2026-04-21 14:59:49.528589083 +0000 UTC m=+253.512481122" Apr 21 14:59:54.989566 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:59:54.989478 2610 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 21 14:59:54.990182 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:59:54.990144 2610 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="484f360a-e0dd-419c-bfbb-758bbe78f68e" containerName="prometheus" containerID="cri-o://40bfa1c660a1fa9bcb4cb608bb14b1d92f3519d960935791f39f62650aecb286" gracePeriod=600 Apr 21 14:59:54.990276 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:59:54.990172 2610 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="484f360a-e0dd-419c-bfbb-758bbe78f68e" containerName="kube-rbac-proxy" containerID="cri-o://c0943ec802acf092ca21b535a44948aa28bf4432e54fbbc1ed424a2cb479ae80" gracePeriod=600 Apr 21 14:59:54.990276 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:59:54.990201 2610 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="484f360a-e0dd-419c-bfbb-758bbe78f68e" containerName="thanos-sidecar" containerID="cri-o://7f2448c70f07e1d6a50bbce602167fdc17c14866b86929124e98b59718ba012e" gracePeriod=600 Apr 21 14:59:54.990276 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:59:54.990248 2610 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="484f360a-e0dd-419c-bfbb-758bbe78f68e" containerName="config-reloader" containerID="cri-o://7276a5aa3611da2f32b7f6f5915d895419dd171f39b84777f9bde593efcc05ee" gracePeriod=600 Apr 21 14:59:54.990428 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:59:54.990203 2610 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="484f360a-e0dd-419c-bfbb-758bbe78f68e" containerName="kube-rbac-proxy-web" containerID="cri-o://bf6740b3996600a554ff56d494c3f92cd3caf842a144846ebbfd1688d6491ed3" gracePeriod=600 Apr 21 14:59:54.990428 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:59:54.990376 2610 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="484f360a-e0dd-419c-bfbb-758bbe78f68e" containerName="kube-rbac-proxy-thanos" containerID="cri-o://586d647ea635c451721398e97383e469831f6b8152c6640e90734512b9ba7ef5" gracePeriod=600 Apr 21 14:59:55.224768 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:59:55.224743 2610 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Apr 21 14:59:55.298202 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:59:55.298165 2610 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/484f360a-e0dd-419c-bfbb-758bbe78f68e-configmap-kubelet-serving-ca-bundle\") pod \"484f360a-e0dd-419c-bfbb-758bbe78f68e\" (UID: \"484f360a-e0dd-419c-bfbb-758bbe78f68e\") " Apr 21 14:59:55.298202 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:59:55.298201 2610 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/484f360a-e0dd-419c-bfbb-758bbe78f68e-secret-metrics-client-certs\") pod \"484f360a-e0dd-419c-bfbb-758bbe78f68e\" (UID: \"484f360a-e0dd-419c-bfbb-758bbe78f68e\") " Apr 21 14:59:55.298418 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:59:55.298230 2610 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/484f360a-e0dd-419c-bfbb-758bbe78f68e-prometheus-k8s-rulefiles-0\") pod \"484f360a-e0dd-419c-bfbb-758bbe78f68e\" (UID: \"484f360a-e0dd-419c-bfbb-758bbe78f68e\") " Apr 21 14:59:55.298418 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:59:55.298255 2610 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/484f360a-e0dd-419c-bfbb-758bbe78f68e-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"484f360a-e0dd-419c-bfbb-758bbe78f68e\" (UID: \"484f360a-e0dd-419c-bfbb-758bbe78f68e\") " Apr 21 14:59:55.298418 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:59:55.298282 2610 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/484f360a-e0dd-419c-bfbb-758bbe78f68e-tls-assets\") pod \"484f360a-e0dd-419c-bfbb-758bbe78f68e\" (UID: \"484f360a-e0dd-419c-bfbb-758bbe78f68e\") " Apr 21 14:59:55.298418 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:59:55.298305 2610 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/484f360a-e0dd-419c-bfbb-758bbe78f68e-configmap-metrics-client-ca\") pod \"484f360a-e0dd-419c-bfbb-758bbe78f68e\" (UID: \"484f360a-e0dd-419c-bfbb-758bbe78f68e\") " Apr 21 14:59:55.298418 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:59:55.298335 2610 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/484f360a-e0dd-419c-bfbb-758bbe78f68e-secret-grpc-tls\") pod \"484f360a-e0dd-419c-bfbb-758bbe78f68e\" (UID: \"484f360a-e0dd-419c-bfbb-758bbe78f68e\") " Apr 21 14:59:55.298418 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:59:55.298360 2610 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/484f360a-e0dd-419c-bfbb-758bbe78f68e-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"484f360a-e0dd-419c-bfbb-758bbe78f68e\" (UID: \"484f360a-e0dd-419c-bfbb-758bbe78f68e\") " Apr 21 14:59:55.298418 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:59:55.298404 2610 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/484f360a-e0dd-419c-bfbb-758bbe78f68e-configmap-serving-certs-ca-bundle\") pod \"484f360a-e0dd-419c-bfbb-758bbe78f68e\" (UID: \"484f360a-e0dd-419c-bfbb-758bbe78f68e\") " Apr 21 14:59:55.298802 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:59:55.298437 2610 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/484f360a-e0dd-419c-bfbb-758bbe78f68e-thanos-prometheus-http-client-file\") pod \"484f360a-e0dd-419c-bfbb-758bbe78f68e\" (UID: \"484f360a-e0dd-419c-bfbb-758bbe78f68e\") " Apr 21 14:59:55.298802 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:59:55.298463 2610 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/484f360a-e0dd-419c-bfbb-758bbe78f68e-config\") pod \"484f360a-e0dd-419c-bfbb-758bbe78f68e\" (UID: \"484f360a-e0dd-419c-bfbb-758bbe78f68e\") " Apr 21 14:59:55.298802 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:59:55.298492 2610 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/484f360a-e0dd-419c-bfbb-758bbe78f68e-prometheus-k8s-db\") pod \"484f360a-e0dd-419c-bfbb-758bbe78f68e\" (UID: \"484f360a-e0dd-419c-bfbb-758bbe78f68e\") " Apr 21 14:59:55.298802 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:59:55.298521 2610 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/484f360a-e0dd-419c-bfbb-758bbe78f68e-secret-prometheus-k8s-tls\") pod \"484f360a-e0dd-419c-bfbb-758bbe78f68e\" (UID: \"484f360a-e0dd-419c-bfbb-758bbe78f68e\") " Apr 21 14:59:55.298802 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:59:55.298550 2610 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rp7v5\" (UniqueName: \"kubernetes.io/projected/484f360a-e0dd-419c-bfbb-758bbe78f68e-kube-api-access-rp7v5\") pod \"484f360a-e0dd-419c-bfbb-758bbe78f68e\" (UID: \"484f360a-e0dd-419c-bfbb-758bbe78f68e\") " Apr 21 14:59:55.298802 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:59:55.298593 2610 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/484f360a-e0dd-419c-bfbb-758bbe78f68e-config-out\") pod \"484f360a-e0dd-419c-bfbb-758bbe78f68e\" (UID: \"484f360a-e0dd-419c-bfbb-758bbe78f68e\") " Apr 21 14:59:55.298802 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:59:55.298628 2610 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/484f360a-e0dd-419c-bfbb-758bbe78f68e-web-config\") pod \"484f360a-e0dd-419c-bfbb-758bbe78f68e\" (UID: \"484f360a-e0dd-419c-bfbb-758bbe78f68e\") " Apr 21 14:59:55.298802 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:59:55.298676 2610 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/484f360a-e0dd-419c-bfbb-758bbe78f68e-configmap-kubelet-serving-ca-bundle" (OuterVolumeSpecName: "configmap-kubelet-serving-ca-bundle") pod "484f360a-e0dd-419c-bfbb-758bbe78f68e" (UID: "484f360a-e0dd-419c-bfbb-758bbe78f68e"). InnerVolumeSpecName "configmap-kubelet-serving-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 21 14:59:55.298802 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:59:55.298700 2610 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/484f360a-e0dd-419c-bfbb-758bbe78f68e-secret-kube-rbac-proxy\") pod \"484f360a-e0dd-419c-bfbb-758bbe78f68e\" (UID: \"484f360a-e0dd-419c-bfbb-758bbe78f68e\") " Apr 21 14:59:55.299238 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:59:55.298858 2610 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/484f360a-e0dd-419c-bfbb-758bbe78f68e-prometheus-trusted-ca-bundle\") pod \"484f360a-e0dd-419c-bfbb-758bbe78f68e\" (UID: \"484f360a-e0dd-419c-bfbb-758bbe78f68e\") " Apr 21 14:59:55.299238 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:59:55.299102 2610 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/484f360a-e0dd-419c-bfbb-758bbe78f68e-configmap-metrics-client-ca" (OuterVolumeSpecName: "configmap-metrics-client-ca") pod "484f360a-e0dd-419c-bfbb-758bbe78f68e" (UID: "484f360a-e0dd-419c-bfbb-758bbe78f68e"). InnerVolumeSpecName "configmap-metrics-client-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 21 14:59:55.299238 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:59:55.299119 2610 reconciler_common.go:299] "Volume detached for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/484f360a-e0dd-419c-bfbb-758bbe78f68e-configmap-kubelet-serving-ca-bundle\") on node \"ip-10-0-129-133.ec2.internal\" DevicePath \"\"" Apr 21 14:59:55.299507 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:59:55.299451 2610 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/484f360a-e0dd-419c-bfbb-758bbe78f68e-prometheus-trusted-ca-bundle" (OuterVolumeSpecName: "prometheus-trusted-ca-bundle") pod "484f360a-e0dd-419c-bfbb-758bbe78f68e" (UID: "484f360a-e0dd-419c-bfbb-758bbe78f68e"). InnerVolumeSpecName "prometheus-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 21 14:59:55.300537 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:59:55.299767 2610 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/484f360a-e0dd-419c-bfbb-758bbe78f68e-prometheus-k8s-rulefiles-0" (OuterVolumeSpecName: "prometheus-k8s-rulefiles-0") pod "484f360a-e0dd-419c-bfbb-758bbe78f68e" (UID: "484f360a-e0dd-419c-bfbb-758bbe78f68e"). InnerVolumeSpecName "prometheus-k8s-rulefiles-0". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 21 14:59:55.300537 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:59:55.300407 2610 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/484f360a-e0dd-419c-bfbb-758bbe78f68e-configmap-serving-certs-ca-bundle" (OuterVolumeSpecName: "configmap-serving-certs-ca-bundle") pod "484f360a-e0dd-419c-bfbb-758bbe78f68e" (UID: "484f360a-e0dd-419c-bfbb-758bbe78f68e"). InnerVolumeSpecName "configmap-serving-certs-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 21 14:59:55.300885 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:59:55.300843 2610 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/484f360a-e0dd-419c-bfbb-758bbe78f68e-prometheus-k8s-db" (OuterVolumeSpecName: "prometheus-k8s-db") pod "484f360a-e0dd-419c-bfbb-758bbe78f68e" (UID: "484f360a-e0dd-419c-bfbb-758bbe78f68e"). InnerVolumeSpecName "prometheus-k8s-db". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 21 14:59:55.301433 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:59:55.301394 2610 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/484f360a-e0dd-419c-bfbb-758bbe78f68e-secret-metrics-client-certs" (OuterVolumeSpecName: "secret-metrics-client-certs") pod "484f360a-e0dd-419c-bfbb-758bbe78f68e" (UID: "484f360a-e0dd-419c-bfbb-758bbe78f68e"). InnerVolumeSpecName "secret-metrics-client-certs". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 21 14:59:55.302227 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:59:55.302173 2610 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/484f360a-e0dd-419c-bfbb-758bbe78f68e-secret-prometheus-k8s-thanos-sidecar-tls" (OuterVolumeSpecName: "secret-prometheus-k8s-thanos-sidecar-tls") pod "484f360a-e0dd-419c-bfbb-758bbe78f68e" (UID: "484f360a-e0dd-419c-bfbb-758bbe78f68e"). InnerVolumeSpecName "secret-prometheus-k8s-thanos-sidecar-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 21 14:59:55.302503 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:59:55.302327 2610 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/484f360a-e0dd-419c-bfbb-758bbe78f68e-thanos-prometheus-http-client-file" (OuterVolumeSpecName: "thanos-prometheus-http-client-file") pod "484f360a-e0dd-419c-bfbb-758bbe78f68e" (UID: "484f360a-e0dd-419c-bfbb-758bbe78f68e"). InnerVolumeSpecName "thanos-prometheus-http-client-file". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 21 14:59:55.302810 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:59:55.302778 2610 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/484f360a-e0dd-419c-bfbb-758bbe78f68e-secret-kube-rbac-proxy" (OuterVolumeSpecName: "secret-kube-rbac-proxy") pod "484f360a-e0dd-419c-bfbb-758bbe78f68e" (UID: "484f360a-e0dd-419c-bfbb-758bbe78f68e"). InnerVolumeSpecName "secret-kube-rbac-proxy". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 21 14:59:55.303454 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:59:55.303421 2610 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/484f360a-e0dd-419c-bfbb-758bbe78f68e-secret-grpc-tls" (OuterVolumeSpecName: "secret-grpc-tls") pod "484f360a-e0dd-419c-bfbb-758bbe78f68e" (UID: "484f360a-e0dd-419c-bfbb-758bbe78f68e"). InnerVolumeSpecName "secret-grpc-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 21 14:59:55.303675 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:59:55.303541 2610 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/484f360a-e0dd-419c-bfbb-758bbe78f68e-config" (OuterVolumeSpecName: "config") pod "484f360a-e0dd-419c-bfbb-758bbe78f68e" (UID: "484f360a-e0dd-419c-bfbb-758bbe78f68e"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 21 14:59:55.303675 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:59:55.303654 2610 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/484f360a-e0dd-419c-bfbb-758bbe78f68e-secret-prometheus-k8s-kube-rbac-proxy-web" (OuterVolumeSpecName: "secret-prometheus-k8s-kube-rbac-proxy-web") pod "484f360a-e0dd-419c-bfbb-758bbe78f68e" (UID: "484f360a-e0dd-419c-bfbb-758bbe78f68e"). InnerVolumeSpecName "secret-prometheus-k8s-kube-rbac-proxy-web". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 21 14:59:55.303839 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:59:55.303813 2610 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/484f360a-e0dd-419c-bfbb-758bbe78f68e-tls-assets" (OuterVolumeSpecName: "tls-assets") pod "484f360a-e0dd-419c-bfbb-758bbe78f68e" (UID: "484f360a-e0dd-419c-bfbb-758bbe78f68e"). InnerVolumeSpecName "tls-assets". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 21 14:59:55.304643 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:59:55.304620 2610 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/484f360a-e0dd-419c-bfbb-758bbe78f68e-secret-prometheus-k8s-tls" (OuterVolumeSpecName: "secret-prometheus-k8s-tls") pod "484f360a-e0dd-419c-bfbb-758bbe78f68e" (UID: "484f360a-e0dd-419c-bfbb-758bbe78f68e"). InnerVolumeSpecName "secret-prometheus-k8s-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 21 14:59:55.304840 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:59:55.304817 2610 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/484f360a-e0dd-419c-bfbb-758bbe78f68e-kube-api-access-rp7v5" (OuterVolumeSpecName: "kube-api-access-rp7v5") pod "484f360a-e0dd-419c-bfbb-758bbe78f68e" (UID: "484f360a-e0dd-419c-bfbb-758bbe78f68e"). InnerVolumeSpecName "kube-api-access-rp7v5". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 21 14:59:55.305367 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:59:55.305348 2610 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/484f360a-e0dd-419c-bfbb-758bbe78f68e-config-out" (OuterVolumeSpecName: "config-out") pod "484f360a-e0dd-419c-bfbb-758bbe78f68e" (UID: "484f360a-e0dd-419c-bfbb-758bbe78f68e"). InnerVolumeSpecName "config-out". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 21 14:59:55.315188 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:59:55.315167 2610 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/484f360a-e0dd-419c-bfbb-758bbe78f68e-web-config" (OuterVolumeSpecName: "web-config") pod "484f360a-e0dd-419c-bfbb-758bbe78f68e" (UID: "484f360a-e0dd-419c-bfbb-758bbe78f68e"). InnerVolumeSpecName "web-config". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 21 14:59:55.400206 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:59:55.400169 2610 reconciler_common.go:299] "Volume detached for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/484f360a-e0dd-419c-bfbb-758bbe78f68e-prometheus-k8s-rulefiles-0\") on node \"ip-10-0-129-133.ec2.internal\" DevicePath \"\"" Apr 21 14:59:55.400206 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:59:55.400202 2610 reconciler_common.go:299] "Volume detached for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/484f360a-e0dd-419c-bfbb-758bbe78f68e-secret-prometheus-k8s-thanos-sidecar-tls\") on node \"ip-10-0-129-133.ec2.internal\" DevicePath \"\"" Apr 21 14:59:55.400206 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:59:55.400212 2610 reconciler_common.go:299] "Volume detached for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/484f360a-e0dd-419c-bfbb-758bbe78f68e-tls-assets\") on node \"ip-10-0-129-133.ec2.internal\" DevicePath \"\"" Apr 21 14:59:55.400417 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:59:55.400221 2610 reconciler_common.go:299] "Volume detached for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/484f360a-e0dd-419c-bfbb-758bbe78f68e-configmap-metrics-client-ca\") on node \"ip-10-0-129-133.ec2.internal\" DevicePath \"\"" Apr 21 14:59:55.400417 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:59:55.400231 2610 reconciler_common.go:299] "Volume detached for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/484f360a-e0dd-419c-bfbb-758bbe78f68e-secret-grpc-tls\") on node \"ip-10-0-129-133.ec2.internal\" DevicePath \"\"" Apr 21 14:59:55.400417 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:59:55.400240 2610 reconciler_common.go:299] "Volume detached for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/484f360a-e0dd-419c-bfbb-758bbe78f68e-secret-prometheus-k8s-kube-rbac-proxy-web\") on node \"ip-10-0-129-133.ec2.internal\" DevicePath \"\"" Apr 21 14:59:55.400417 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:59:55.400249 2610 reconciler_common.go:299] "Volume detached for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/484f360a-e0dd-419c-bfbb-758bbe78f68e-configmap-serving-certs-ca-bundle\") on node \"ip-10-0-129-133.ec2.internal\" DevicePath \"\"" Apr 21 14:59:55.400417 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:59:55.400258 2610 reconciler_common.go:299] "Volume detached for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/484f360a-e0dd-419c-bfbb-758bbe78f68e-thanos-prometheus-http-client-file\") on node \"ip-10-0-129-133.ec2.internal\" DevicePath \"\"" Apr 21 14:59:55.400417 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:59:55.400267 2610 reconciler_common.go:299] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/484f360a-e0dd-419c-bfbb-758bbe78f68e-config\") on node \"ip-10-0-129-133.ec2.internal\" DevicePath \"\"" Apr 21 14:59:55.400417 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:59:55.400275 2610 reconciler_common.go:299] "Volume detached for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/484f360a-e0dd-419c-bfbb-758bbe78f68e-prometheus-k8s-db\") on node \"ip-10-0-129-133.ec2.internal\" DevicePath \"\"" Apr 21 14:59:55.400417 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:59:55.400283 2610 reconciler_common.go:299] "Volume detached for volume \"secret-prometheus-k8s-tls\" 
(UniqueName: \"kubernetes.io/secret/484f360a-e0dd-419c-bfbb-758bbe78f68e-secret-prometheus-k8s-tls\") on node \"ip-10-0-129-133.ec2.internal\" DevicePath \"\"" Apr 21 14:59:55.400417 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:59:55.400292 2610 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-rp7v5\" (UniqueName: \"kubernetes.io/projected/484f360a-e0dd-419c-bfbb-758bbe78f68e-kube-api-access-rp7v5\") on node \"ip-10-0-129-133.ec2.internal\" DevicePath \"\"" Apr 21 14:59:55.400417 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:59:55.400300 2610 reconciler_common.go:299] "Volume detached for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/484f360a-e0dd-419c-bfbb-758bbe78f68e-config-out\") on node \"ip-10-0-129-133.ec2.internal\" DevicePath \"\"" Apr 21 14:59:55.400417 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:59:55.400310 2610 reconciler_common.go:299] "Volume detached for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/484f360a-e0dd-419c-bfbb-758bbe78f68e-web-config\") on node \"ip-10-0-129-133.ec2.internal\" DevicePath \"\"" Apr 21 14:59:55.400417 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:59:55.400318 2610 reconciler_common.go:299] "Volume detached for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/484f360a-e0dd-419c-bfbb-758bbe78f68e-secret-kube-rbac-proxy\") on node \"ip-10-0-129-133.ec2.internal\" DevicePath \"\"" Apr 21 14:59:55.400417 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:59:55.400326 2610 reconciler_common.go:299] "Volume detached for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/484f360a-e0dd-419c-bfbb-758bbe78f68e-prometheus-trusted-ca-bundle\") on node \"ip-10-0-129-133.ec2.internal\" DevicePath \"\"" Apr 21 14:59:55.400417 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:59:55.400335 2610 reconciler_common.go:299] "Volume detached for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/484f360a-e0dd-419c-bfbb-758bbe78f68e-secret-metrics-client-certs\") on node \"ip-10-0-129-133.ec2.internal\" DevicePath \"\"" Apr 21 14:59:55.530999 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:59:55.530964 2610 generic.go:358] "Generic (PLEG): container finished" podID="484f360a-e0dd-419c-bfbb-758bbe78f68e" containerID="586d647ea635c451721398e97383e469831f6b8152c6640e90734512b9ba7ef5" exitCode=0 Apr 21 14:59:55.530999 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:59:55.530995 2610 generic.go:358] "Generic (PLEG): container finished" podID="484f360a-e0dd-419c-bfbb-758bbe78f68e" containerID="c0943ec802acf092ca21b535a44948aa28bf4432e54fbbc1ed424a2cb479ae80" exitCode=0 Apr 21 14:59:55.530999 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:59:55.531003 2610 generic.go:358] "Generic (PLEG): container finished" podID="484f360a-e0dd-419c-bfbb-758bbe78f68e" containerID="bf6740b3996600a554ff56d494c3f92cd3caf842a144846ebbfd1688d6491ed3" exitCode=0 Apr 21 14:59:55.531262 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:59:55.531012 2610 generic.go:358] "Generic (PLEG): container finished" podID="484f360a-e0dd-419c-bfbb-758bbe78f68e" containerID="7f2448c70f07e1d6a50bbce602167fdc17c14866b86929124e98b59718ba012e" exitCode=0 Apr 21 14:59:55.531262 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:59:55.531019 2610 generic.go:358] "Generic (PLEG): container finished" podID="484f360a-e0dd-419c-bfbb-758bbe78f68e" containerID="7276a5aa3611da2f32b7f6f5915d895419dd171f39b84777f9bde593efcc05ee" exitCode=0 Apr 21 14:59:55.531262 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:59:55.531026 2610 
generic.go:358] "Generic (PLEG): container finished" podID="484f360a-e0dd-419c-bfbb-758bbe78f68e" containerID="40bfa1c660a1fa9bcb4cb608bb14b1d92f3519d960935791f39f62650aecb286" exitCode=0 Apr 21 14:59:55.531262 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:59:55.531044 2610 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"484f360a-e0dd-419c-bfbb-758bbe78f68e","Type":"ContainerDied","Data":"586d647ea635c451721398e97383e469831f6b8152c6640e90734512b9ba7ef5"} Apr 21 14:59:55.531262 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:59:55.531083 2610 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"484f360a-e0dd-419c-bfbb-758bbe78f68e","Type":"ContainerDied","Data":"c0943ec802acf092ca21b535a44948aa28bf4432e54fbbc1ed424a2cb479ae80"} Apr 21 14:59:55.531262 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:59:55.531095 2610 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"484f360a-e0dd-419c-bfbb-758bbe78f68e","Type":"ContainerDied","Data":"bf6740b3996600a554ff56d494c3f92cd3caf842a144846ebbfd1688d6491ed3"} Apr 21 14:59:55.531262 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:59:55.531105 2610 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"484f360a-e0dd-419c-bfbb-758bbe78f68e","Type":"ContainerDied","Data":"7f2448c70f07e1d6a50bbce602167fdc17c14866b86929124e98b59718ba012e"} Apr 21 14:59:55.531262 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:59:55.531115 2610 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"484f360a-e0dd-419c-bfbb-758bbe78f68e","Type":"ContainerDied","Data":"7276a5aa3611da2f32b7f6f5915d895419dd171f39b84777f9bde593efcc05ee"} Apr 21 14:59:55.531262 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:59:55.531114 2610 scope.go:117] "RemoveContainer" containerID="586d647ea635c451721398e97383e469831f6b8152c6640e90734512b9ba7ef5" Apr 21 14:59:55.531262 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:59:55.531124 2610 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"484f360a-e0dd-419c-bfbb-758bbe78f68e","Type":"ContainerDied","Data":"40bfa1c660a1fa9bcb4cb608bb14b1d92f3519d960935791f39f62650aecb286"} Apr 21 14:59:55.531262 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:59:55.531134 2610 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"484f360a-e0dd-419c-bfbb-758bbe78f68e","Type":"ContainerDied","Data":"761992a80599b99706106b8c138e17718233c3b804c2810bc4429cd544580513"} Apr 21 14:59:55.531262 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:59:55.531095 2610 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Apr 21 14:59:55.539623 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:59:55.539601 2610 scope.go:117] "RemoveContainer" containerID="c0943ec802acf092ca21b535a44948aa28bf4432e54fbbc1ed424a2cb479ae80" Apr 21 14:59:55.546411 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:59:55.546394 2610 scope.go:117] "RemoveContainer" containerID="bf6740b3996600a554ff56d494c3f92cd3caf842a144846ebbfd1688d6491ed3" Apr 21 14:59:55.552864 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:59:55.552848 2610 scope.go:117] "RemoveContainer" containerID="7f2448c70f07e1d6a50bbce602167fdc17c14866b86929124e98b59718ba012e" Apr 21 14:59:55.555603 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:59:55.555565 2610 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 21 14:59:55.559425 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:59:55.559409 2610 scope.go:117] "RemoveContainer" containerID="7276a5aa3611da2f32b7f6f5915d895419dd171f39b84777f9bde593efcc05ee" Apr 21 14:59:55.561143 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:59:55.561124 2610 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 21 14:59:55.566272 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:59:55.566255 2610 scope.go:117] "RemoveContainer" containerID="40bfa1c660a1fa9bcb4cb608bb14b1d92f3519d960935791f39f62650aecb286" Apr 21 14:59:55.572835 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:59:55.572818 2610 scope.go:117] "RemoveContainer" containerID="5f7a8e1571e7f70440c3efc11d3b5c7516e6817ce74c7e325b41c5620c66bd8e" Apr 21 14:59:55.579018 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:59:55.579004 2610 scope.go:117] "RemoveContainer" containerID="586d647ea635c451721398e97383e469831f6b8152c6640e90734512b9ba7ef5" Apr 21 14:59:55.579273 ip-10-0-129-133 kubenswrapper[2610]: E0421 14:59:55.579252 2610 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"586d647ea635c451721398e97383e469831f6b8152c6640e90734512b9ba7ef5\": container with ID starting with 586d647ea635c451721398e97383e469831f6b8152c6640e90734512b9ba7ef5 not found: ID does not exist" containerID="586d647ea635c451721398e97383e469831f6b8152c6640e90734512b9ba7ef5" Apr 21 14:59:55.579376 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:59:55.579287 2610 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"586d647ea635c451721398e97383e469831f6b8152c6640e90734512b9ba7ef5"} err="failed to get container status \"586d647ea635c451721398e97383e469831f6b8152c6640e90734512b9ba7ef5\": rpc error: code = NotFound desc = could not find container \"586d647ea635c451721398e97383e469831f6b8152c6640e90734512b9ba7ef5\": container with ID starting with 586d647ea635c451721398e97383e469831f6b8152c6640e90734512b9ba7ef5 not found: ID does not exist" Apr 21 14:59:55.579376 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:59:55.579310 2610 scope.go:117] "RemoveContainer" containerID="c0943ec802acf092ca21b535a44948aa28bf4432e54fbbc1ed424a2cb479ae80" Apr 21 14:59:55.579598 ip-10-0-129-133 kubenswrapper[2610]: E0421 14:59:55.579552 2610 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c0943ec802acf092ca21b535a44948aa28bf4432e54fbbc1ed424a2cb479ae80\": container with ID starting with c0943ec802acf092ca21b535a44948aa28bf4432e54fbbc1ed424a2cb479ae80 not found: ID does not exist" 
containerID="c0943ec802acf092ca21b535a44948aa28bf4432e54fbbc1ed424a2cb479ae80" Apr 21 14:59:55.579765 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:59:55.579604 2610 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c0943ec802acf092ca21b535a44948aa28bf4432e54fbbc1ed424a2cb479ae80"} err="failed to get container status \"c0943ec802acf092ca21b535a44948aa28bf4432e54fbbc1ed424a2cb479ae80\": rpc error: code = NotFound desc = could not find container \"c0943ec802acf092ca21b535a44948aa28bf4432e54fbbc1ed424a2cb479ae80\": container with ID starting with c0943ec802acf092ca21b535a44948aa28bf4432e54fbbc1ed424a2cb479ae80 not found: ID does not exist" Apr 21 14:59:55.579765 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:59:55.579625 2610 scope.go:117] "RemoveContainer" containerID="bf6740b3996600a554ff56d494c3f92cd3caf842a144846ebbfd1688d6491ed3" Apr 21 14:59:55.579887 ip-10-0-129-133 kubenswrapper[2610]: E0421 14:59:55.579869 2610 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bf6740b3996600a554ff56d494c3f92cd3caf842a144846ebbfd1688d6491ed3\": container with ID starting with bf6740b3996600a554ff56d494c3f92cd3caf842a144846ebbfd1688d6491ed3 not found: ID does not exist" containerID="bf6740b3996600a554ff56d494c3f92cd3caf842a144846ebbfd1688d6491ed3" Apr 21 14:59:55.579943 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:59:55.579896 2610 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bf6740b3996600a554ff56d494c3f92cd3caf842a144846ebbfd1688d6491ed3"} err="failed to get container status \"bf6740b3996600a554ff56d494c3f92cd3caf842a144846ebbfd1688d6491ed3\": rpc error: code = NotFound desc = could not find container \"bf6740b3996600a554ff56d494c3f92cd3caf842a144846ebbfd1688d6491ed3\": container with ID starting with bf6740b3996600a554ff56d494c3f92cd3caf842a144846ebbfd1688d6491ed3 not found: ID does not exist" Apr 21 14:59:55.579943 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:59:55.579916 2610 scope.go:117] "RemoveContainer" containerID="7f2448c70f07e1d6a50bbce602167fdc17c14866b86929124e98b59718ba012e" Apr 21 14:59:55.580148 ip-10-0-129-133 kubenswrapper[2610]: E0421 14:59:55.580129 2610 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7f2448c70f07e1d6a50bbce602167fdc17c14866b86929124e98b59718ba012e\": container with ID starting with 7f2448c70f07e1d6a50bbce602167fdc17c14866b86929124e98b59718ba012e not found: ID does not exist" containerID="7f2448c70f07e1d6a50bbce602167fdc17c14866b86929124e98b59718ba012e" Apr 21 14:59:55.580182 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:59:55.580153 2610 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7f2448c70f07e1d6a50bbce602167fdc17c14866b86929124e98b59718ba012e"} err="failed to get container status \"7f2448c70f07e1d6a50bbce602167fdc17c14866b86929124e98b59718ba012e\": rpc error: code = NotFound desc = could not find container \"7f2448c70f07e1d6a50bbce602167fdc17c14866b86929124e98b59718ba012e\": container with ID starting with 7f2448c70f07e1d6a50bbce602167fdc17c14866b86929124e98b59718ba012e not found: ID does not exist" Apr 21 14:59:55.580182 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:59:55.580169 2610 scope.go:117] "RemoveContainer" containerID="7276a5aa3611da2f32b7f6f5915d895419dd171f39b84777f9bde593efcc05ee" Apr 21 14:59:55.580400 ip-10-0-129-133 kubenswrapper[2610]: E0421 
14:59:55.580384 2610 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7276a5aa3611da2f32b7f6f5915d895419dd171f39b84777f9bde593efcc05ee\": container with ID starting with 7276a5aa3611da2f32b7f6f5915d895419dd171f39b84777f9bde593efcc05ee not found: ID does not exist" containerID="7276a5aa3611da2f32b7f6f5915d895419dd171f39b84777f9bde593efcc05ee" Apr 21 14:59:55.580444 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:59:55.580405 2610 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7276a5aa3611da2f32b7f6f5915d895419dd171f39b84777f9bde593efcc05ee"} err="failed to get container status \"7276a5aa3611da2f32b7f6f5915d895419dd171f39b84777f9bde593efcc05ee\": rpc error: code = NotFound desc = could not find container \"7276a5aa3611da2f32b7f6f5915d895419dd171f39b84777f9bde593efcc05ee\": container with ID starting with 7276a5aa3611da2f32b7f6f5915d895419dd171f39b84777f9bde593efcc05ee not found: ID does not exist" Apr 21 14:59:55.580444 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:59:55.580419 2610 scope.go:117] "RemoveContainer" containerID="40bfa1c660a1fa9bcb4cb608bb14b1d92f3519d960935791f39f62650aecb286" Apr 21 14:59:55.580658 ip-10-0-129-133 kubenswrapper[2610]: E0421 14:59:55.580641 2610 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"40bfa1c660a1fa9bcb4cb608bb14b1d92f3519d960935791f39f62650aecb286\": container with ID starting with 40bfa1c660a1fa9bcb4cb608bb14b1d92f3519d960935791f39f62650aecb286 not found: ID does not exist" containerID="40bfa1c660a1fa9bcb4cb608bb14b1d92f3519d960935791f39f62650aecb286" Apr 21 14:59:55.580731 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:59:55.580663 2610 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"40bfa1c660a1fa9bcb4cb608bb14b1d92f3519d960935791f39f62650aecb286"} err="failed to get container status \"40bfa1c660a1fa9bcb4cb608bb14b1d92f3519d960935791f39f62650aecb286\": rpc error: code = NotFound desc = could not find container \"40bfa1c660a1fa9bcb4cb608bb14b1d92f3519d960935791f39f62650aecb286\": container with ID starting with 40bfa1c660a1fa9bcb4cb608bb14b1d92f3519d960935791f39f62650aecb286 not found: ID does not exist" Apr 21 14:59:55.580731 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:59:55.580684 2610 scope.go:117] "RemoveContainer" containerID="5f7a8e1571e7f70440c3efc11d3b5c7516e6817ce74c7e325b41c5620c66bd8e" Apr 21 14:59:55.580934 ip-10-0-129-133 kubenswrapper[2610]: E0421 14:59:55.580918 2610 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5f7a8e1571e7f70440c3efc11d3b5c7516e6817ce74c7e325b41c5620c66bd8e\": container with ID starting with 5f7a8e1571e7f70440c3efc11d3b5c7516e6817ce74c7e325b41c5620c66bd8e not found: ID does not exist" containerID="5f7a8e1571e7f70440c3efc11d3b5c7516e6817ce74c7e325b41c5620c66bd8e" Apr 21 14:59:55.580992 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:59:55.580938 2610 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5f7a8e1571e7f70440c3efc11d3b5c7516e6817ce74c7e325b41c5620c66bd8e"} err="failed to get container status \"5f7a8e1571e7f70440c3efc11d3b5c7516e6817ce74c7e325b41c5620c66bd8e\": rpc error: code = NotFound desc = could not find container \"5f7a8e1571e7f70440c3efc11d3b5c7516e6817ce74c7e325b41c5620c66bd8e\": container with ID starting with 
5f7a8e1571e7f70440c3efc11d3b5c7516e6817ce74c7e325b41c5620c66bd8e not found: ID does not exist" Apr 21 14:59:55.580992 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:59:55.580954 2610 scope.go:117] "RemoveContainer" containerID="586d647ea635c451721398e97383e469831f6b8152c6640e90734512b9ba7ef5" Apr 21 14:59:55.581193 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:59:55.581170 2610 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"586d647ea635c451721398e97383e469831f6b8152c6640e90734512b9ba7ef5"} err="failed to get container status \"586d647ea635c451721398e97383e469831f6b8152c6640e90734512b9ba7ef5\": rpc error: code = NotFound desc = could not find container \"586d647ea635c451721398e97383e469831f6b8152c6640e90734512b9ba7ef5\": container with ID starting with 586d647ea635c451721398e97383e469831f6b8152c6640e90734512b9ba7ef5 not found: ID does not exist" Apr 21 14:59:55.581257 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:59:55.581193 2610 scope.go:117] "RemoveContainer" containerID="c0943ec802acf092ca21b535a44948aa28bf4432e54fbbc1ed424a2cb479ae80" Apr 21 14:59:55.581394 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:59:55.581370 2610 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c0943ec802acf092ca21b535a44948aa28bf4432e54fbbc1ed424a2cb479ae80"} err="failed to get container status \"c0943ec802acf092ca21b535a44948aa28bf4432e54fbbc1ed424a2cb479ae80\": rpc error: code = NotFound desc = could not find container \"c0943ec802acf092ca21b535a44948aa28bf4432e54fbbc1ed424a2cb479ae80\": container with ID starting with c0943ec802acf092ca21b535a44948aa28bf4432e54fbbc1ed424a2cb479ae80 not found: ID does not exist" Apr 21 14:59:55.581394 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:59:55.581394 2610 scope.go:117] "RemoveContainer" containerID="bf6740b3996600a554ff56d494c3f92cd3caf842a144846ebbfd1688d6491ed3" Apr 21 14:59:55.581676 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:59:55.581659 2610 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bf6740b3996600a554ff56d494c3f92cd3caf842a144846ebbfd1688d6491ed3"} err="failed to get container status \"bf6740b3996600a554ff56d494c3f92cd3caf842a144846ebbfd1688d6491ed3\": rpc error: code = NotFound desc = could not find container \"bf6740b3996600a554ff56d494c3f92cd3caf842a144846ebbfd1688d6491ed3\": container with ID starting with bf6740b3996600a554ff56d494c3f92cd3caf842a144846ebbfd1688d6491ed3 not found: ID does not exist" Apr 21 14:59:55.581723 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:59:55.581679 2610 scope.go:117] "RemoveContainer" containerID="7f2448c70f07e1d6a50bbce602167fdc17c14866b86929124e98b59718ba012e" Apr 21 14:59:55.581871 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:59:55.581856 2610 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7f2448c70f07e1d6a50bbce602167fdc17c14866b86929124e98b59718ba012e"} err="failed to get container status \"7f2448c70f07e1d6a50bbce602167fdc17c14866b86929124e98b59718ba012e\": rpc error: code = NotFound desc = could not find container \"7f2448c70f07e1d6a50bbce602167fdc17c14866b86929124e98b59718ba012e\": container with ID starting with 7f2448c70f07e1d6a50bbce602167fdc17c14866b86929124e98b59718ba012e not found: ID does not exist" Apr 21 14:59:55.581907 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:59:55.581872 2610 scope.go:117] "RemoveContainer" containerID="7276a5aa3611da2f32b7f6f5915d895419dd171f39b84777f9bde593efcc05ee" Apr 21 
14:59:55.582068 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:59:55.582054 2610 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7276a5aa3611da2f32b7f6f5915d895419dd171f39b84777f9bde593efcc05ee"} err="failed to get container status \"7276a5aa3611da2f32b7f6f5915d895419dd171f39b84777f9bde593efcc05ee\": rpc error: code = NotFound desc = could not find container \"7276a5aa3611da2f32b7f6f5915d895419dd171f39b84777f9bde593efcc05ee\": container with ID starting with 7276a5aa3611da2f32b7f6f5915d895419dd171f39b84777f9bde593efcc05ee not found: ID does not exist" Apr 21 14:59:55.582108 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:59:55.582068 2610 scope.go:117] "RemoveContainer" containerID="40bfa1c660a1fa9bcb4cb608bb14b1d92f3519d960935791f39f62650aecb286" Apr 21 14:59:55.582298 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:59:55.582270 2610 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"40bfa1c660a1fa9bcb4cb608bb14b1d92f3519d960935791f39f62650aecb286"} err="failed to get container status \"40bfa1c660a1fa9bcb4cb608bb14b1d92f3519d960935791f39f62650aecb286\": rpc error: code = NotFound desc = could not find container \"40bfa1c660a1fa9bcb4cb608bb14b1d92f3519d960935791f39f62650aecb286\": container with ID starting with 40bfa1c660a1fa9bcb4cb608bb14b1d92f3519d960935791f39f62650aecb286 not found: ID does not exist" Apr 21 14:59:55.582338 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:59:55.582300 2610 scope.go:117] "RemoveContainer" containerID="5f7a8e1571e7f70440c3efc11d3b5c7516e6817ce74c7e325b41c5620c66bd8e" Apr 21 14:59:55.582496 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:59:55.582481 2610 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5f7a8e1571e7f70440c3efc11d3b5c7516e6817ce74c7e325b41c5620c66bd8e"} err="failed to get container status \"5f7a8e1571e7f70440c3efc11d3b5c7516e6817ce74c7e325b41c5620c66bd8e\": rpc error: code = NotFound desc = could not find container \"5f7a8e1571e7f70440c3efc11d3b5c7516e6817ce74c7e325b41c5620c66bd8e\": container with ID starting with 5f7a8e1571e7f70440c3efc11d3b5c7516e6817ce74c7e325b41c5620c66bd8e not found: ID does not exist" Apr 21 14:59:55.582595 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:59:55.582496 2610 scope.go:117] "RemoveContainer" containerID="586d647ea635c451721398e97383e469831f6b8152c6640e90734512b9ba7ef5" Apr 21 14:59:55.582740 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:59:55.582720 2610 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"586d647ea635c451721398e97383e469831f6b8152c6640e90734512b9ba7ef5"} err="failed to get container status \"586d647ea635c451721398e97383e469831f6b8152c6640e90734512b9ba7ef5\": rpc error: code = NotFound desc = could not find container \"586d647ea635c451721398e97383e469831f6b8152c6640e90734512b9ba7ef5\": container with ID starting with 586d647ea635c451721398e97383e469831f6b8152c6640e90734512b9ba7ef5 not found: ID does not exist" Apr 21 14:59:55.582787 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:59:55.582741 2610 scope.go:117] "RemoveContainer" containerID="c0943ec802acf092ca21b535a44948aa28bf4432e54fbbc1ed424a2cb479ae80" Apr 21 14:59:55.582945 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:59:55.582930 2610 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c0943ec802acf092ca21b535a44948aa28bf4432e54fbbc1ed424a2cb479ae80"} err="failed to get container status 
\"c0943ec802acf092ca21b535a44948aa28bf4432e54fbbc1ed424a2cb479ae80\": rpc error: code = NotFound desc = could not find container \"c0943ec802acf092ca21b535a44948aa28bf4432e54fbbc1ed424a2cb479ae80\": container with ID starting with c0943ec802acf092ca21b535a44948aa28bf4432e54fbbc1ed424a2cb479ae80 not found: ID does not exist" Apr 21 14:59:55.582985 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:59:55.582944 2610 scope.go:117] "RemoveContainer" containerID="bf6740b3996600a554ff56d494c3f92cd3caf842a144846ebbfd1688d6491ed3" Apr 21 14:59:55.583151 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:59:55.583130 2610 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bf6740b3996600a554ff56d494c3f92cd3caf842a144846ebbfd1688d6491ed3"} err="failed to get container status \"bf6740b3996600a554ff56d494c3f92cd3caf842a144846ebbfd1688d6491ed3\": rpc error: code = NotFound desc = could not find container \"bf6740b3996600a554ff56d494c3f92cd3caf842a144846ebbfd1688d6491ed3\": container with ID starting with bf6740b3996600a554ff56d494c3f92cd3caf842a144846ebbfd1688d6491ed3 not found: ID does not exist" Apr 21 14:59:55.583151 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:59:55.583151 2610 scope.go:117] "RemoveContainer" containerID="7f2448c70f07e1d6a50bbce602167fdc17c14866b86929124e98b59718ba012e" Apr 21 14:59:55.583338 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:59:55.583321 2610 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7f2448c70f07e1d6a50bbce602167fdc17c14866b86929124e98b59718ba012e"} err="failed to get container status \"7f2448c70f07e1d6a50bbce602167fdc17c14866b86929124e98b59718ba012e\": rpc error: code = NotFound desc = could not find container \"7f2448c70f07e1d6a50bbce602167fdc17c14866b86929124e98b59718ba012e\": container with ID starting with 7f2448c70f07e1d6a50bbce602167fdc17c14866b86929124e98b59718ba012e not found: ID does not exist" Apr 21 14:59:55.583377 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:59:55.583338 2610 scope.go:117] "RemoveContainer" containerID="7276a5aa3611da2f32b7f6f5915d895419dd171f39b84777f9bde593efcc05ee" Apr 21 14:59:55.583532 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:59:55.583513 2610 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7276a5aa3611da2f32b7f6f5915d895419dd171f39b84777f9bde593efcc05ee"} err="failed to get container status \"7276a5aa3611da2f32b7f6f5915d895419dd171f39b84777f9bde593efcc05ee\": rpc error: code = NotFound desc = could not find container \"7276a5aa3611da2f32b7f6f5915d895419dd171f39b84777f9bde593efcc05ee\": container with ID starting with 7276a5aa3611da2f32b7f6f5915d895419dd171f39b84777f9bde593efcc05ee not found: ID does not exist" Apr 21 14:59:55.583594 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:59:55.583534 2610 scope.go:117] "RemoveContainer" containerID="40bfa1c660a1fa9bcb4cb608bb14b1d92f3519d960935791f39f62650aecb286" Apr 21 14:59:55.583759 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:59:55.583740 2610 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"40bfa1c660a1fa9bcb4cb608bb14b1d92f3519d960935791f39f62650aecb286"} err="failed to get container status \"40bfa1c660a1fa9bcb4cb608bb14b1d92f3519d960935791f39f62650aecb286\": rpc error: code = NotFound desc = could not find container \"40bfa1c660a1fa9bcb4cb608bb14b1d92f3519d960935791f39f62650aecb286\": container with ID starting with 40bfa1c660a1fa9bcb4cb608bb14b1d92f3519d960935791f39f62650aecb286 not found: ID does 
not exist" Apr 21 14:59:55.583832 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:59:55.583760 2610 scope.go:117] "RemoveContainer" containerID="5f7a8e1571e7f70440c3efc11d3b5c7516e6817ce74c7e325b41c5620c66bd8e" Apr 21 14:59:55.583974 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:59:55.583956 2610 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5f7a8e1571e7f70440c3efc11d3b5c7516e6817ce74c7e325b41c5620c66bd8e"} err="failed to get container status \"5f7a8e1571e7f70440c3efc11d3b5c7516e6817ce74c7e325b41c5620c66bd8e\": rpc error: code = NotFound desc = could not find container \"5f7a8e1571e7f70440c3efc11d3b5c7516e6817ce74c7e325b41c5620c66bd8e\": container with ID starting with 5f7a8e1571e7f70440c3efc11d3b5c7516e6817ce74c7e325b41c5620c66bd8e not found: ID does not exist" Apr 21 14:59:55.584018 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:59:55.583977 2610 scope.go:117] "RemoveContainer" containerID="586d647ea635c451721398e97383e469831f6b8152c6640e90734512b9ba7ef5" Apr 21 14:59:55.584160 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:59:55.584143 2610 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"586d647ea635c451721398e97383e469831f6b8152c6640e90734512b9ba7ef5"} err="failed to get container status \"586d647ea635c451721398e97383e469831f6b8152c6640e90734512b9ba7ef5\": rpc error: code = NotFound desc = could not find container \"586d647ea635c451721398e97383e469831f6b8152c6640e90734512b9ba7ef5\": container with ID starting with 586d647ea635c451721398e97383e469831f6b8152c6640e90734512b9ba7ef5 not found: ID does not exist" Apr 21 14:59:55.584224 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:59:55.584161 2610 scope.go:117] "RemoveContainer" containerID="c0943ec802acf092ca21b535a44948aa28bf4432e54fbbc1ed424a2cb479ae80" Apr 21 14:59:55.584385 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:59:55.584364 2610 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c0943ec802acf092ca21b535a44948aa28bf4432e54fbbc1ed424a2cb479ae80"} err="failed to get container status \"c0943ec802acf092ca21b535a44948aa28bf4432e54fbbc1ed424a2cb479ae80\": rpc error: code = NotFound desc = could not find container \"c0943ec802acf092ca21b535a44948aa28bf4432e54fbbc1ed424a2cb479ae80\": container with ID starting with c0943ec802acf092ca21b535a44948aa28bf4432e54fbbc1ed424a2cb479ae80 not found: ID does not exist" Apr 21 14:59:55.584431 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:59:55.584386 2610 scope.go:117] "RemoveContainer" containerID="bf6740b3996600a554ff56d494c3f92cd3caf842a144846ebbfd1688d6491ed3" Apr 21 14:59:55.584598 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:59:55.584559 2610 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bf6740b3996600a554ff56d494c3f92cd3caf842a144846ebbfd1688d6491ed3"} err="failed to get container status \"bf6740b3996600a554ff56d494c3f92cd3caf842a144846ebbfd1688d6491ed3\": rpc error: code = NotFound desc = could not find container \"bf6740b3996600a554ff56d494c3f92cd3caf842a144846ebbfd1688d6491ed3\": container with ID starting with bf6740b3996600a554ff56d494c3f92cd3caf842a144846ebbfd1688d6491ed3 not found: ID does not exist" Apr 21 14:59:55.584669 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:59:55.584602 2610 scope.go:117] "RemoveContainer" containerID="7f2448c70f07e1d6a50bbce602167fdc17c14866b86929124e98b59718ba012e" Apr 21 14:59:55.584811 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:59:55.584794 2610 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7f2448c70f07e1d6a50bbce602167fdc17c14866b86929124e98b59718ba012e"} err="failed to get container status \"7f2448c70f07e1d6a50bbce602167fdc17c14866b86929124e98b59718ba012e\": rpc error: code = NotFound desc = could not find container \"7f2448c70f07e1d6a50bbce602167fdc17c14866b86929124e98b59718ba012e\": container with ID starting with 7f2448c70f07e1d6a50bbce602167fdc17c14866b86929124e98b59718ba012e not found: ID does not exist" Apr 21 14:59:55.584859 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:59:55.584811 2610 scope.go:117] "RemoveContainer" containerID="7276a5aa3611da2f32b7f6f5915d895419dd171f39b84777f9bde593efcc05ee" Apr 21 14:59:55.585001 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:59:55.584974 2610 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7276a5aa3611da2f32b7f6f5915d895419dd171f39b84777f9bde593efcc05ee"} err="failed to get container status \"7276a5aa3611da2f32b7f6f5915d895419dd171f39b84777f9bde593efcc05ee\": rpc error: code = NotFound desc = could not find container \"7276a5aa3611da2f32b7f6f5915d895419dd171f39b84777f9bde593efcc05ee\": container with ID starting with 7276a5aa3611da2f32b7f6f5915d895419dd171f39b84777f9bde593efcc05ee not found: ID does not exist" Apr 21 14:59:55.585001 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:59:55.584999 2610 scope.go:117] "RemoveContainer" containerID="40bfa1c660a1fa9bcb4cb608bb14b1d92f3519d960935791f39f62650aecb286" Apr 21 14:59:55.585183 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:59:55.585162 2610 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"40bfa1c660a1fa9bcb4cb608bb14b1d92f3519d960935791f39f62650aecb286"} err="failed to get container status \"40bfa1c660a1fa9bcb4cb608bb14b1d92f3519d960935791f39f62650aecb286\": rpc error: code = NotFound desc = could not find container \"40bfa1c660a1fa9bcb4cb608bb14b1d92f3519d960935791f39f62650aecb286\": container with ID starting with 40bfa1c660a1fa9bcb4cb608bb14b1d92f3519d960935791f39f62650aecb286 not found: ID does not exist" Apr 21 14:59:55.585234 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:59:55.585184 2610 scope.go:117] "RemoveContainer" containerID="5f7a8e1571e7f70440c3efc11d3b5c7516e6817ce74c7e325b41c5620c66bd8e" Apr 21 14:59:55.585387 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:59:55.585372 2610 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5f7a8e1571e7f70440c3efc11d3b5c7516e6817ce74c7e325b41c5620c66bd8e"} err="failed to get container status \"5f7a8e1571e7f70440c3efc11d3b5c7516e6817ce74c7e325b41c5620c66bd8e\": rpc error: code = NotFound desc = could not find container \"5f7a8e1571e7f70440c3efc11d3b5c7516e6817ce74c7e325b41c5620c66bd8e\": container with ID starting with 5f7a8e1571e7f70440c3efc11d3b5c7516e6817ce74c7e325b41c5620c66bd8e not found: ID does not exist" Apr 21 14:59:55.585431 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:59:55.585388 2610 scope.go:117] "RemoveContainer" containerID="586d647ea635c451721398e97383e469831f6b8152c6640e90734512b9ba7ef5" Apr 21 14:59:55.585629 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:59:55.585565 2610 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"586d647ea635c451721398e97383e469831f6b8152c6640e90734512b9ba7ef5"} err="failed to get container status \"586d647ea635c451721398e97383e469831f6b8152c6640e90734512b9ba7ef5\": rpc error: code = NotFound desc = could not 
find container \"586d647ea635c451721398e97383e469831f6b8152c6640e90734512b9ba7ef5\": container with ID starting with 586d647ea635c451721398e97383e469831f6b8152c6640e90734512b9ba7ef5 not found: ID does not exist" Apr 21 14:59:55.585681 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:59:55.585633 2610 scope.go:117] "RemoveContainer" containerID="c0943ec802acf092ca21b535a44948aa28bf4432e54fbbc1ed424a2cb479ae80" Apr 21 14:59:55.585841 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:59:55.585822 2610 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c0943ec802acf092ca21b535a44948aa28bf4432e54fbbc1ed424a2cb479ae80"} err="failed to get container status \"c0943ec802acf092ca21b535a44948aa28bf4432e54fbbc1ed424a2cb479ae80\": rpc error: code = NotFound desc = could not find container \"c0943ec802acf092ca21b535a44948aa28bf4432e54fbbc1ed424a2cb479ae80\": container with ID starting with c0943ec802acf092ca21b535a44948aa28bf4432e54fbbc1ed424a2cb479ae80 not found: ID does not exist" Apr 21 14:59:55.585921 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:59:55.585846 2610 scope.go:117] "RemoveContainer" containerID="bf6740b3996600a554ff56d494c3f92cd3caf842a144846ebbfd1688d6491ed3" Apr 21 14:59:55.586040 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:59:55.586023 2610 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bf6740b3996600a554ff56d494c3f92cd3caf842a144846ebbfd1688d6491ed3"} err="failed to get container status \"bf6740b3996600a554ff56d494c3f92cd3caf842a144846ebbfd1688d6491ed3\": rpc error: code = NotFound desc = could not find container \"bf6740b3996600a554ff56d494c3f92cd3caf842a144846ebbfd1688d6491ed3\": container with ID starting with bf6740b3996600a554ff56d494c3f92cd3caf842a144846ebbfd1688d6491ed3 not found: ID does not exist" Apr 21 14:59:55.586087 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:59:55.586040 2610 scope.go:117] "RemoveContainer" containerID="7f2448c70f07e1d6a50bbce602167fdc17c14866b86929124e98b59718ba012e" Apr 21 14:59:55.586201 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:59:55.586186 2610 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7f2448c70f07e1d6a50bbce602167fdc17c14866b86929124e98b59718ba012e"} err="failed to get container status \"7f2448c70f07e1d6a50bbce602167fdc17c14866b86929124e98b59718ba012e\": rpc error: code = NotFound desc = could not find container \"7f2448c70f07e1d6a50bbce602167fdc17c14866b86929124e98b59718ba012e\": container with ID starting with 7f2448c70f07e1d6a50bbce602167fdc17c14866b86929124e98b59718ba012e not found: ID does not exist" Apr 21 14:59:55.586249 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:59:55.586202 2610 scope.go:117] "RemoveContainer" containerID="7276a5aa3611da2f32b7f6f5915d895419dd171f39b84777f9bde593efcc05ee" Apr 21 14:59:55.586398 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:59:55.586380 2610 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7276a5aa3611da2f32b7f6f5915d895419dd171f39b84777f9bde593efcc05ee"} err="failed to get container status \"7276a5aa3611da2f32b7f6f5915d895419dd171f39b84777f9bde593efcc05ee\": rpc error: code = NotFound desc = could not find container \"7276a5aa3611da2f32b7f6f5915d895419dd171f39b84777f9bde593efcc05ee\": container with ID starting with 7276a5aa3611da2f32b7f6f5915d895419dd171f39b84777f9bde593efcc05ee not found: ID does not exist" Apr 21 14:59:55.586462 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:59:55.586399 2610 scope.go:117] 
"RemoveContainer" containerID="40bfa1c660a1fa9bcb4cb608bb14b1d92f3519d960935791f39f62650aecb286" Apr 21 14:59:55.586633 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:59:55.586565 2610 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"40bfa1c660a1fa9bcb4cb608bb14b1d92f3519d960935791f39f62650aecb286"} err="failed to get container status \"40bfa1c660a1fa9bcb4cb608bb14b1d92f3519d960935791f39f62650aecb286\": rpc error: code = NotFound desc = could not find container \"40bfa1c660a1fa9bcb4cb608bb14b1d92f3519d960935791f39f62650aecb286\": container with ID starting with 40bfa1c660a1fa9bcb4cb608bb14b1d92f3519d960935791f39f62650aecb286 not found: ID does not exist" Apr 21 14:59:55.586702 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:59:55.586636 2610 scope.go:117] "RemoveContainer" containerID="5f7a8e1571e7f70440c3efc11d3b5c7516e6817ce74c7e325b41c5620c66bd8e" Apr 21 14:59:55.586824 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:59:55.586807 2610 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5f7a8e1571e7f70440c3efc11d3b5c7516e6817ce74c7e325b41c5620c66bd8e"} err="failed to get container status \"5f7a8e1571e7f70440c3efc11d3b5c7516e6817ce74c7e325b41c5620c66bd8e\": rpc error: code = NotFound desc = could not find container \"5f7a8e1571e7f70440c3efc11d3b5c7516e6817ce74c7e325b41c5620c66bd8e\": container with ID starting with 5f7a8e1571e7f70440c3efc11d3b5c7516e6817ce74c7e325b41c5620c66bd8e not found: ID does not exist" Apr 21 14:59:55.586866 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:59:55.586825 2610 scope.go:117] "RemoveContainer" containerID="586d647ea635c451721398e97383e469831f6b8152c6640e90734512b9ba7ef5" Apr 21 14:59:55.587035 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:59:55.587018 2610 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"586d647ea635c451721398e97383e469831f6b8152c6640e90734512b9ba7ef5"} err="failed to get container status \"586d647ea635c451721398e97383e469831f6b8152c6640e90734512b9ba7ef5\": rpc error: code = NotFound desc = could not find container \"586d647ea635c451721398e97383e469831f6b8152c6640e90734512b9ba7ef5\": container with ID starting with 586d647ea635c451721398e97383e469831f6b8152c6640e90734512b9ba7ef5 not found: ID does not exist" Apr 21 14:59:55.587100 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:59:55.587037 2610 scope.go:117] "RemoveContainer" containerID="c0943ec802acf092ca21b535a44948aa28bf4432e54fbbc1ed424a2cb479ae80" Apr 21 14:59:55.587250 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:59:55.587231 2610 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c0943ec802acf092ca21b535a44948aa28bf4432e54fbbc1ed424a2cb479ae80"} err="failed to get container status \"c0943ec802acf092ca21b535a44948aa28bf4432e54fbbc1ed424a2cb479ae80\": rpc error: code = NotFound desc = could not find container \"c0943ec802acf092ca21b535a44948aa28bf4432e54fbbc1ed424a2cb479ae80\": container with ID starting with c0943ec802acf092ca21b535a44948aa28bf4432e54fbbc1ed424a2cb479ae80 not found: ID does not exist" Apr 21 14:59:55.587303 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:59:55.587251 2610 scope.go:117] "RemoveContainer" containerID="bf6740b3996600a554ff56d494c3f92cd3caf842a144846ebbfd1688d6491ed3" Apr 21 14:59:55.587438 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:59:55.587421 2610 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"bf6740b3996600a554ff56d494c3f92cd3caf842a144846ebbfd1688d6491ed3"} err="failed to get container status \"bf6740b3996600a554ff56d494c3f92cd3caf842a144846ebbfd1688d6491ed3\": rpc error: code = NotFound desc = could not find container \"bf6740b3996600a554ff56d494c3f92cd3caf842a144846ebbfd1688d6491ed3\": container with ID starting with bf6740b3996600a554ff56d494c3f92cd3caf842a144846ebbfd1688d6491ed3 not found: ID does not exist" Apr 21 14:59:55.587501 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:59:55.587439 2610 scope.go:117] "RemoveContainer" containerID="7f2448c70f07e1d6a50bbce602167fdc17c14866b86929124e98b59718ba012e" Apr 21 14:59:55.587658 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:59:55.587640 2610 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7f2448c70f07e1d6a50bbce602167fdc17c14866b86929124e98b59718ba012e"} err="failed to get container status \"7f2448c70f07e1d6a50bbce602167fdc17c14866b86929124e98b59718ba012e\": rpc error: code = NotFound desc = could not find container \"7f2448c70f07e1d6a50bbce602167fdc17c14866b86929124e98b59718ba012e\": container with ID starting with 7f2448c70f07e1d6a50bbce602167fdc17c14866b86929124e98b59718ba012e not found: ID does not exist" Apr 21 14:59:55.587710 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:59:55.587658 2610 scope.go:117] "RemoveContainer" containerID="7276a5aa3611da2f32b7f6f5915d895419dd171f39b84777f9bde593efcc05ee" Apr 21 14:59:55.587841 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:59:55.587827 2610 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7276a5aa3611da2f32b7f6f5915d895419dd171f39b84777f9bde593efcc05ee"} err="failed to get container status \"7276a5aa3611da2f32b7f6f5915d895419dd171f39b84777f9bde593efcc05ee\": rpc error: code = NotFound desc = could not find container \"7276a5aa3611da2f32b7f6f5915d895419dd171f39b84777f9bde593efcc05ee\": container with ID starting with 7276a5aa3611da2f32b7f6f5915d895419dd171f39b84777f9bde593efcc05ee not found: ID does not exist" Apr 21 14:59:55.587885 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:59:55.587842 2610 scope.go:117] "RemoveContainer" containerID="40bfa1c660a1fa9bcb4cb608bb14b1d92f3519d960935791f39f62650aecb286" Apr 21 14:59:55.588049 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:59:55.588025 2610 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"40bfa1c660a1fa9bcb4cb608bb14b1d92f3519d960935791f39f62650aecb286"} err="failed to get container status \"40bfa1c660a1fa9bcb4cb608bb14b1d92f3519d960935791f39f62650aecb286\": rpc error: code = NotFound desc = could not find container \"40bfa1c660a1fa9bcb4cb608bb14b1d92f3519d960935791f39f62650aecb286\": container with ID starting with 40bfa1c660a1fa9bcb4cb608bb14b1d92f3519d960935791f39f62650aecb286 not found: ID does not exist" Apr 21 14:59:55.588049 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:59:55.588048 2610 scope.go:117] "RemoveContainer" containerID="5f7a8e1571e7f70440c3efc11d3b5c7516e6817ce74c7e325b41c5620c66bd8e" Apr 21 14:59:55.588233 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:59:55.588214 2610 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5f7a8e1571e7f70440c3efc11d3b5c7516e6817ce74c7e325b41c5620c66bd8e"} err="failed to get container status \"5f7a8e1571e7f70440c3efc11d3b5c7516e6817ce74c7e325b41c5620c66bd8e\": rpc error: code = NotFound desc = could not find container 
\"5f7a8e1571e7f70440c3efc11d3b5c7516e6817ce74c7e325b41c5620c66bd8e\": container with ID starting with 5f7a8e1571e7f70440c3efc11d3b5c7516e6817ce74c7e325b41c5620c66bd8e not found: ID does not exist" Apr 21 14:59:55.593105 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:59:55.593085 2610 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 21 14:59:55.593350 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:59:55.593337 2610 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="484f360a-e0dd-419c-bfbb-758bbe78f68e" containerName="prometheus" Apr 21 14:59:55.593350 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:59:55.593350 2610 state_mem.go:107] "Deleted CPUSet assignment" podUID="484f360a-e0dd-419c-bfbb-758bbe78f68e" containerName="prometheus" Apr 21 14:59:55.593454 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:59:55.593360 2610 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="484f360a-e0dd-419c-bfbb-758bbe78f68e" containerName="init-config-reloader" Apr 21 14:59:55.593454 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:59:55.593366 2610 state_mem.go:107] "Deleted CPUSet assignment" podUID="484f360a-e0dd-419c-bfbb-758bbe78f68e" containerName="init-config-reloader" Apr 21 14:59:55.593454 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:59:55.593376 2610 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="484f360a-e0dd-419c-bfbb-758bbe78f68e" containerName="kube-rbac-proxy-web" Apr 21 14:59:55.593454 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:59:55.593381 2610 state_mem.go:107] "Deleted CPUSet assignment" podUID="484f360a-e0dd-419c-bfbb-758bbe78f68e" containerName="kube-rbac-proxy-web" Apr 21 14:59:55.593454 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:59:55.593393 2610 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="484f360a-e0dd-419c-bfbb-758bbe78f68e" containerName="thanos-sidecar" Apr 21 14:59:55.593454 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:59:55.593399 2610 state_mem.go:107] "Deleted CPUSet assignment" podUID="484f360a-e0dd-419c-bfbb-758bbe78f68e" containerName="thanos-sidecar" Apr 21 14:59:55.593454 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:59:55.593406 2610 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="484f360a-e0dd-419c-bfbb-758bbe78f68e" containerName="kube-rbac-proxy" Apr 21 14:59:55.593454 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:59:55.593411 2610 state_mem.go:107] "Deleted CPUSet assignment" podUID="484f360a-e0dd-419c-bfbb-758bbe78f68e" containerName="kube-rbac-proxy" Apr 21 14:59:55.593454 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:59:55.593419 2610 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="5ac89af4-5925-4a52-a694-31a92b841ed6" containerName="registry" Apr 21 14:59:55.593454 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:59:55.593424 2610 state_mem.go:107] "Deleted CPUSet assignment" podUID="5ac89af4-5925-4a52-a694-31a92b841ed6" containerName="registry" Apr 21 14:59:55.593454 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:59:55.593430 2610 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="484f360a-e0dd-419c-bfbb-758bbe78f68e" containerName="kube-rbac-proxy-thanos" Apr 21 14:59:55.593454 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:59:55.593435 2610 state_mem.go:107] "Deleted CPUSet assignment" podUID="484f360a-e0dd-419c-bfbb-758bbe78f68e" containerName="kube-rbac-proxy-thanos" Apr 21 14:59:55.593454 
Apr 21 14:59:55.593454 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:59:55.593441 2610 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="484f360a-e0dd-419c-bfbb-758bbe78f68e" containerName="config-reloader"
Apr 21 14:59:55.593454 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:59:55.593446 2610 state_mem.go:107] "Deleted CPUSet assignment" podUID="484f360a-e0dd-419c-bfbb-758bbe78f68e" containerName="config-reloader"
Apr 21 14:59:55.594085 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:59:55.593489 2610 memory_manager.go:356] "RemoveStaleState removing state" podUID="484f360a-e0dd-419c-bfbb-758bbe78f68e" containerName="kube-rbac-proxy"
Apr 21 14:59:55.594085 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:59:55.593498 2610 memory_manager.go:356] "RemoveStaleState removing state" podUID="484f360a-e0dd-419c-bfbb-758bbe78f68e" containerName="prometheus"
Apr 21 14:59:55.594085 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:59:55.593504 2610 memory_manager.go:356] "RemoveStaleState removing state" podUID="484f360a-e0dd-419c-bfbb-758bbe78f68e" containerName="config-reloader"
Apr 21 14:59:55.594085 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:59:55.593510 2610 memory_manager.go:356] "RemoveStaleState removing state" podUID="484f360a-e0dd-419c-bfbb-758bbe78f68e" containerName="kube-rbac-proxy-web"
Apr 21 14:59:55.594085 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:59:55.593517 2610 memory_manager.go:356] "RemoveStaleState removing state" podUID="5ac89af4-5925-4a52-a694-31a92b841ed6" containerName="registry"
Apr 21 14:59:55.594085 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:59:55.593523 2610 memory_manager.go:356] "RemoveStaleState removing state" podUID="484f360a-e0dd-419c-bfbb-758bbe78f68e" containerName="kube-rbac-proxy-thanos"
Apr 21 14:59:55.594085 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:59:55.593530 2610 memory_manager.go:356] "RemoveStaleState removing state" podUID="484f360a-e0dd-419c-bfbb-758bbe78f68e" containerName="thanos-sidecar"
Apr 21 14:59:55.598881 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:59:55.598864 2610 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0"
Apr 21 14:59:55.601062 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:59:55.601044 2610 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls\""
Apr 21 14:59:55.601141 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:59:55.601044 2610 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-prometheus-http-client-file\""
Apr 21 14:59:55.601190 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:59:55.601047 2610 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s\""
Apr 21 14:59:55.601228 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:59:55.601211 2610 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-sidecar-tls\""
Apr 21 14:59:55.601289 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:59:55.601273 2610 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-rbac-proxy\""
Apr 21 14:59:55.601360 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:59:55.601342 2610 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-dockercfg-l5jw5\""
Apr 21 14:59:55.601615 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:59:55.601597 2610 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-client-certs\""
Apr 21 14:59:55.601695 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:59:55.601629 2610 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-web-config\""
Apr 21 14:59:55.601695 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:59:55.601647 2610 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls-assets-0\""
Apr 21 14:59:55.602462 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:59:55.602447 2610 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"serving-certs-ca-bundle\""
Apr 21 14:59:55.602539 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:59:55.602453 2610 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-kube-rbac-proxy-web\""
Apr 21 14:59:55.603438 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:59:55.603423 2610 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-grpc-tls-f3ddpi19md28c\""
Apr 21 14:59:55.604774 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:59:55.604755 2610 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-rulefiles-0\""
Apr 21 14:59:55.608604 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:59:55.608559 2610 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"prometheus-trusted-ca-bundle\""
Apr 21 14:59:55.614100 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:59:55.614078 2610 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"]
\"kubernetes.io/configmap/2952e283-ff73-4bc3-9b8f-2ae4a4b32ee5-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"2952e283-ff73-4bc3-9b8f-2ae4a4b32ee5\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 14:59:55.703387 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:59:55.703235 2610 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/2952e283-ff73-4bc3-9b8f-2ae4a4b32ee5-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"2952e283-ff73-4bc3-9b8f-2ae4a4b32ee5\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 14:59:55.703387 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:59:55.703280 2610 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2952e283-ff73-4bc3-9b8f-2ae4a4b32ee5-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"2952e283-ff73-4bc3-9b8f-2ae4a4b32ee5\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 14:59:55.703387 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:59:55.703302 2610 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/2952e283-ff73-4bc3-9b8f-2ae4a4b32ee5-config-out\") pod \"prometheus-k8s-0\" (UID: \"2952e283-ff73-4bc3-9b8f-2ae4a4b32ee5\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 14:59:55.703387 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:59:55.703330 2610 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/2952e283-ff73-4bc3-9b8f-2ae4a4b32ee5-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"2952e283-ff73-4bc3-9b8f-2ae4a4b32ee5\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 14:59:55.703387 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:59:55.703366 2610 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2952e283-ff73-4bc3-9b8f-2ae4a4b32ee5-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"2952e283-ff73-4bc3-9b8f-2ae4a4b32ee5\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 14:59:55.703387 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:59:55.703384 2610 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2hq2j\" (UniqueName: \"kubernetes.io/projected/2952e283-ff73-4bc3-9b8f-2ae4a4b32ee5-kube-api-access-2hq2j\") pod \"prometheus-k8s-0\" (UID: \"2952e283-ff73-4bc3-9b8f-2ae4a4b32ee5\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 14:59:55.703623 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:59:55.703400 2610 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/2952e283-ff73-4bc3-9b8f-2ae4a4b32ee5-config\") pod \"prometheus-k8s-0\" (UID: \"2952e283-ff73-4bc3-9b8f-2ae4a4b32ee5\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 14:59:55.703623 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:59:55.703416 2610 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-grpc-tls\" (UniqueName: 
\"kubernetes.io/secret/2952e283-ff73-4bc3-9b8f-2ae4a4b32ee5-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"2952e283-ff73-4bc3-9b8f-2ae4a4b32ee5\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 14:59:55.703623 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:59:55.703464 2610 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/2952e283-ff73-4bc3-9b8f-2ae4a4b32ee5-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"2952e283-ff73-4bc3-9b8f-2ae4a4b32ee5\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 14:59:55.703623 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:59:55.703499 2610 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/2952e283-ff73-4bc3-9b8f-2ae4a4b32ee5-web-config\") pod \"prometheus-k8s-0\" (UID: \"2952e283-ff73-4bc3-9b8f-2ae4a4b32ee5\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 14:59:55.703623 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:59:55.703523 2610 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/2952e283-ff73-4bc3-9b8f-2ae4a4b32ee5-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"2952e283-ff73-4bc3-9b8f-2ae4a4b32ee5\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 14:59:55.703623 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:59:55.703540 2610 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/2952e283-ff73-4bc3-9b8f-2ae4a4b32ee5-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"2952e283-ff73-4bc3-9b8f-2ae4a4b32ee5\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 14:59:55.703623 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:59:55.703559 2610 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/2952e283-ff73-4bc3-9b8f-2ae4a4b32ee5-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"2952e283-ff73-4bc3-9b8f-2ae4a4b32ee5\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 14:59:55.703623 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:59:55.703600 2610 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/2952e283-ff73-4bc3-9b8f-2ae4a4b32ee5-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"2952e283-ff73-4bc3-9b8f-2ae4a4b32ee5\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 14:59:55.703853 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:59:55.703653 2610 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/2952e283-ff73-4bc3-9b8f-2ae4a4b32ee5-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"2952e283-ff73-4bc3-9b8f-2ae4a4b32ee5\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 14:59:55.703853 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:59:55.703676 2610 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2952e283-ff73-4bc3-9b8f-2ae4a4b32ee5-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: 
\"2952e283-ff73-4bc3-9b8f-2ae4a4b32ee5\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 14:59:55.703853 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:59:55.703700 2610 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/2952e283-ff73-4bc3-9b8f-2ae4a4b32ee5-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"2952e283-ff73-4bc3-9b8f-2ae4a4b32ee5\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 14:59:55.804819 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:59:55.804733 2610 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/2952e283-ff73-4bc3-9b8f-2ae4a4b32ee5-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"2952e283-ff73-4bc3-9b8f-2ae4a4b32ee5\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 14:59:55.804819 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:59:55.804772 2610 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2952e283-ff73-4bc3-9b8f-2ae4a4b32ee5-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"2952e283-ff73-4bc3-9b8f-2ae4a4b32ee5\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 14:59:55.804819 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:59:55.804798 2610 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/2952e283-ff73-4bc3-9b8f-2ae4a4b32ee5-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"2952e283-ff73-4bc3-9b8f-2ae4a4b32ee5\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 14:59:55.805070 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:59:55.804936 2610 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/2952e283-ff73-4bc3-9b8f-2ae4a4b32ee5-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"2952e283-ff73-4bc3-9b8f-2ae4a4b32ee5\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 14:59:55.805070 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:59:55.804986 2610 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/2952e283-ff73-4bc3-9b8f-2ae4a4b32ee5-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"2952e283-ff73-4bc3-9b8f-2ae4a4b32ee5\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 14:59:55.805070 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:59:55.805026 2610 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2952e283-ff73-4bc3-9b8f-2ae4a4b32ee5-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"2952e283-ff73-4bc3-9b8f-2ae4a4b32ee5\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 14:59:55.805070 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:59:55.805055 2610 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/2952e283-ff73-4bc3-9b8f-2ae4a4b32ee5-config-out\") pod \"prometheus-k8s-0\" (UID: \"2952e283-ff73-4bc3-9b8f-2ae4a4b32ee5\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 14:59:55.805267 ip-10-0-129-133 kubenswrapper[2610]: I0421 
Apr 21 14:59:55.805267 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:59:55.805099 2610 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/2952e283-ff73-4bc3-9b8f-2ae4a4b32ee5-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"2952e283-ff73-4bc3-9b8f-2ae4a4b32ee5\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 21 14:59:55.805267 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:59:55.805132 2610 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2952e283-ff73-4bc3-9b8f-2ae4a4b32ee5-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"2952e283-ff73-4bc3-9b8f-2ae4a4b32ee5\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 21 14:59:55.805267 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:59:55.805159 2610 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2hq2j\" (UniqueName: \"kubernetes.io/projected/2952e283-ff73-4bc3-9b8f-2ae4a4b32ee5-kube-api-access-2hq2j\") pod \"prometheus-k8s-0\" (UID: \"2952e283-ff73-4bc3-9b8f-2ae4a4b32ee5\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 21 14:59:55.805267 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:59:55.805189 2610 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/2952e283-ff73-4bc3-9b8f-2ae4a4b32ee5-config\") pod \"prometheus-k8s-0\" (UID: \"2952e283-ff73-4bc3-9b8f-2ae4a4b32ee5\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 21 14:59:55.805267 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:59:55.805213 2610 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/2952e283-ff73-4bc3-9b8f-2ae4a4b32ee5-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"2952e283-ff73-4bc3-9b8f-2ae4a4b32ee5\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 21 14:59:55.805267 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:59:55.805244 2610 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/2952e283-ff73-4bc3-9b8f-2ae4a4b32ee5-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"2952e283-ff73-4bc3-9b8f-2ae4a4b32ee5\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 21 14:59:55.805542 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:59:55.805274 2610 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/2952e283-ff73-4bc3-9b8f-2ae4a4b32ee5-web-config\") pod \"prometheus-k8s-0\" (UID: \"2952e283-ff73-4bc3-9b8f-2ae4a4b32ee5\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 21 14:59:55.805542 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:59:55.805305 2610 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/2952e283-ff73-4bc3-9b8f-2ae4a4b32ee5-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"2952e283-ff73-4bc3-9b8f-2ae4a4b32ee5\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 21 14:59:55.805542 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:59:55.805334 2610 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/2952e283-ff73-4bc3-9b8f-2ae4a4b32ee5-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"2952e283-ff73-4bc3-9b8f-2ae4a4b32ee5\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 21 14:59:55.805542 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:59:55.805367 2610 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/2952e283-ff73-4bc3-9b8f-2ae4a4b32ee5-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"2952e283-ff73-4bc3-9b8f-2ae4a4b32ee5\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 21 14:59:55.805542 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:59:55.805394 2610 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/2952e283-ff73-4bc3-9b8f-2ae4a4b32ee5-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"2952e283-ff73-4bc3-9b8f-2ae4a4b32ee5\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 21 14:59:55.805907 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:59:55.805712 2610 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/2952e283-ff73-4bc3-9b8f-2ae4a4b32ee5-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"2952e283-ff73-4bc3-9b8f-2ae4a4b32ee5\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 21 14:59:55.805907 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:59:55.805731 2610 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2952e283-ff73-4bc3-9b8f-2ae4a4b32ee5-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"2952e283-ff73-4bc3-9b8f-2ae4a4b32ee5\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 21 14:59:55.806016 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:59:55.805986 2610 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2952e283-ff73-4bc3-9b8f-2ae4a4b32ee5-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"2952e283-ff73-4bc3-9b8f-2ae4a4b32ee5\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 21 14:59:55.806241 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:59:55.806088 2610 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2952e283-ff73-4bc3-9b8f-2ae4a4b32ee5-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"2952e283-ff73-4bc3-9b8f-2ae4a4b32ee5\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 21 14:59:55.808451 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:59:55.808270 2610 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/2952e283-ff73-4bc3-9b8f-2ae4a4b32ee5-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"2952e283-ff73-4bc3-9b8f-2ae4a4b32ee5\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 21 14:59:55.808451 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:59:55.808320 2610 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/2952e283-ff73-4bc3-9b8f-2ae4a4b32ee5-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"2952e283-ff73-4bc3-9b8f-2ae4a4b32ee5\") " pod="openshift-monitoring/prometheus-k8s-0"
succeeded for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/2952e283-ff73-4bc3-9b8f-2ae4a4b32ee5-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"2952e283-ff73-4bc3-9b8f-2ae4a4b32ee5\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 14:59:55.808690 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:59:55.808656 2610 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/2952e283-ff73-4bc3-9b8f-2ae4a4b32ee5-config-out\") pod \"prometheus-k8s-0\" (UID: \"2952e283-ff73-4bc3-9b8f-2ae4a4b32ee5\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 14:59:55.808690 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:59:55.808676 2610 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/2952e283-ff73-4bc3-9b8f-2ae4a4b32ee5-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"2952e283-ff73-4bc3-9b8f-2ae4a4b32ee5\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 14:59:55.808808 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:59:55.808783 2610 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/2952e283-ff73-4bc3-9b8f-2ae4a4b32ee5-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"2952e283-ff73-4bc3-9b8f-2ae4a4b32ee5\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 14:59:55.809021 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:59:55.808999 2610 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/2952e283-ff73-4bc3-9b8f-2ae4a4b32ee5-config\") pod \"prometheus-k8s-0\" (UID: \"2952e283-ff73-4bc3-9b8f-2ae4a4b32ee5\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 14:59:55.809321 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:59:55.809297 2610 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/2952e283-ff73-4bc3-9b8f-2ae4a4b32ee5-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"2952e283-ff73-4bc3-9b8f-2ae4a4b32ee5\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 14:59:55.809418 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:59:55.809343 2610 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/2952e283-ff73-4bc3-9b8f-2ae4a4b32ee5-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"2952e283-ff73-4bc3-9b8f-2ae4a4b32ee5\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 14:59:55.810385 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:59:55.810363 2610 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/2952e283-ff73-4bc3-9b8f-2ae4a4b32ee5-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"2952e283-ff73-4bc3-9b8f-2ae4a4b32ee5\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 14:59:55.811459 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:59:55.811427 2610 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/2952e283-ff73-4bc3-9b8f-2ae4a4b32ee5-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"2952e283-ff73-4bc3-9b8f-2ae4a4b32ee5\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 14:59:55.811534 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:59:55.811441 2610 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/2952e283-ff73-4bc3-9b8f-2ae4a4b32ee5-web-config\") pod \"prometheus-k8s-0\" (UID: \"2952e283-ff73-4bc3-9b8f-2ae4a4b32ee5\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 14:59:55.811644 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:59:55.811629 2610 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/2952e283-ff73-4bc3-9b8f-2ae4a4b32ee5-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"2952e283-ff73-4bc3-9b8f-2ae4a4b32ee5\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 14:59:55.814078 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:59:55.814056 2610 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2hq2j\" (UniqueName: \"kubernetes.io/projected/2952e283-ff73-4bc3-9b8f-2ae4a4b32ee5-kube-api-access-2hq2j\") pod \"prometheus-k8s-0\" (UID: \"2952e283-ff73-4bc3-9b8f-2ae4a4b32ee5\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 14:59:55.908827 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:59:55.908786 2610 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Apr 21 14:59:56.039164 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:59:56.039035 2610 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 21 14:59:56.044535 ip-10-0-129-133 kubenswrapper[2610]: W0421 14:59:56.044330 2610 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2952e283_ff73_4bc3_9b8f_2ae4a4b32ee5.slice/crio-266b7540c6a954a9f686ec28fb02069bc5656fe07eeade2640c82c0d7ff64f18 WatchSource:0}: Error finding container 266b7540c6a954a9f686ec28fb02069bc5656fe07eeade2640c82c0d7ff64f18: Status 404 returned error can't find the container with id 266b7540c6a954a9f686ec28fb02069bc5656fe07eeade2640c82c0d7ff64f18 Apr 21 14:59:56.535894 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:59:56.535855 2610 generic.go:358] "Generic (PLEG): container finished" podID="2952e283-ff73-4bc3-9b8f-2ae4a4b32ee5" containerID="07da2c0348ccb3c16f83a95ef6e161342a97708711de66b20a4c9203dbd4df9e" exitCode=0 Apr 21 14:59:56.536204 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:59:56.535938 2610 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"2952e283-ff73-4bc3-9b8f-2ae4a4b32ee5","Type":"ContainerDied","Data":"07da2c0348ccb3c16f83a95ef6e161342a97708711de66b20a4c9203dbd4df9e"} Apr 21 14:59:56.536204 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:59:56.535969 2610 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"2952e283-ff73-4bc3-9b8f-2ae4a4b32ee5","Type":"ContainerStarted","Data":"266b7540c6a954a9f686ec28fb02069bc5656fe07eeade2640c82c0d7ff64f18"} Apr 21 14:59:56.626351 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:59:56.626325 2610 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="484f360a-e0dd-419c-bfbb-758bbe78f68e" path="/var/lib/kubelet/pods/484f360a-e0dd-419c-bfbb-758bbe78f68e/volumes" Apr 21 14:59:57.542481 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:59:57.542446 2610 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" 
event={"ID":"2952e283-ff73-4bc3-9b8f-2ae4a4b32ee5","Type":"ContainerStarted","Data":"8f724fb3ca4da8b41c2cd45732fe42219b6df5302c5caaf5491c10862707a766"} Apr 21 14:59:57.542481 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:59:57.542480 2610 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"2952e283-ff73-4bc3-9b8f-2ae4a4b32ee5","Type":"ContainerStarted","Data":"57e71658e4813cc0a5b4615ab4215bbd297893053edcc3d5e0c6a6f5e91e125f"} Apr 21 14:59:57.542907 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:59:57.542489 2610 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"2952e283-ff73-4bc3-9b8f-2ae4a4b32ee5","Type":"ContainerStarted","Data":"45b6dd2f1dc15d76c7a9eaf69aaf49cca6cbe22bf0c25ced2e918b9e921dda70"} Apr 21 14:59:57.542907 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:59:57.542500 2610 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"2952e283-ff73-4bc3-9b8f-2ae4a4b32ee5","Type":"ContainerStarted","Data":"5b347b2853a662f4b9209eb8c4f4c34c8615ccb7c29037ac3721c1d7165aae8a"} Apr 21 14:59:57.542907 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:59:57.542508 2610 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"2952e283-ff73-4bc3-9b8f-2ae4a4b32ee5","Type":"ContainerStarted","Data":"f73a9e1b3cfd733deba6a67f20fb5174daf3af20aa48155a6d5d243a367dba11"} Apr 21 14:59:57.542907 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:59:57.542516 2610 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"2952e283-ff73-4bc3-9b8f-2ae4a4b32ee5","Type":"ContainerStarted","Data":"cae818c4489979ca1ea9672193e919ed8529212fbf8fbad75d072806afc2e2fe"} Apr 21 14:59:57.573755 ip-10-0-129-133 kubenswrapper[2610]: I0421 14:59:57.573705 2610 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-k8s-0" podStartSLOduration=2.573689506 podStartE2EDuration="2.573689506s" podCreationTimestamp="2026-04-21 14:59:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-21 14:59:57.572415715 +0000 UTC m=+261.556307762" watchObservedRunningTime="2026-04-21 14:59:57.573689506 +0000 UTC m=+261.557581554" Apr 21 15:00:00.909877 ip-10-0-129-133 kubenswrapper[2610]: I0421 15:00:00.909835 2610 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/prometheus-k8s-0" Apr 21 15:00:36.497332 ip-10-0-129-133 kubenswrapper[2610]: I0421 15:00:36.497295 2610 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-2k25j_d982beb0-1451-48ab-b61a-060b6d23cfc7/console-operator/2.log" Apr 21 15:00:36.498345 ip-10-0-129-133 kubenswrapper[2610]: I0421 15:00:36.498144 2610 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-2k25j_d982beb0-1451-48ab-b61a-060b6d23cfc7/console-operator/2.log" Apr 21 15:00:36.509882 ip-10-0-129-133 kubenswrapper[2610]: I0421 15:00:36.509864 2610 kubelet.go:1628] "Image garbage collection succeeded" Apr 21 15:00:55.909270 ip-10-0-129-133 kubenswrapper[2610]: I0421 15:00:55.909233 2610 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-monitoring/prometheus-k8s-0" Apr 21 15:00:55.925212 ip-10-0-129-133 kubenswrapper[2610]: I0421 
15:00:55.925184 2610 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-monitoring/prometheus-k8s-0" Apr 21 15:00:56.729077 ip-10-0-129-133 kubenswrapper[2610]: I0421 15:00:56.729052 2610 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/prometheus-k8s-0" Apr 21 15:02:57.999173 ip-10-0-129-133 kubenswrapper[2610]: I0421 15:02:57.999095 2610 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-lws-operator/openshift-lws-operator-bfc7f696d-qzsjr"] Apr 21 15:02:58.002141 ip-10-0-129-133 kubenswrapper[2610]: I0421 15:02:58.002125 2610 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-qzsjr" Apr 21 15:02:58.003981 ip-10-0-129-133 kubenswrapper[2610]: I0421 15:02:58.003962 2610 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-lws-operator\"/\"kube-root-ca.crt\"" Apr 21 15:02:58.004362 ip-10-0-129-133 kubenswrapper[2610]: I0421 15:02:58.004345 2610 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-lws-operator\"/\"openshift-lws-operator-dockercfg-fhh77\"" Apr 21 15:02:58.004446 ip-10-0-129-133 kubenswrapper[2610]: I0421 15:02:58.004349 2610 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-lws-operator\"/\"openshift-service-ca.crt\"" Apr 21 15:02:58.011525 ip-10-0-129-133 kubenswrapper[2610]: I0421 15:02:58.011503 2610 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-lws-operator/openshift-lws-operator-bfc7f696d-qzsjr"] Apr 21 15:02:58.077902 ip-10-0-129-133 kubenswrapper[2610]: I0421 15:02:58.077862 2610 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xfdsc\" (UniqueName: \"kubernetes.io/projected/6fc4122e-7ed8-4da1-8ee2-8373944c2cbc-kube-api-access-xfdsc\") pod \"openshift-lws-operator-bfc7f696d-qzsjr\" (UID: \"6fc4122e-7ed8-4da1-8ee2-8373944c2cbc\") " pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-qzsjr" Apr 21 15:02:58.078083 ip-10-0-129-133 kubenswrapper[2610]: I0421 15:02:58.077910 2610 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/6fc4122e-7ed8-4da1-8ee2-8373944c2cbc-tmp\") pod \"openshift-lws-operator-bfc7f696d-qzsjr\" (UID: \"6fc4122e-7ed8-4da1-8ee2-8373944c2cbc\") " pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-qzsjr" Apr 21 15:02:58.178455 ip-10-0-129-133 kubenswrapper[2610]: I0421 15:02:58.178411 2610 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/6fc4122e-7ed8-4da1-8ee2-8373944c2cbc-tmp\") pod \"openshift-lws-operator-bfc7f696d-qzsjr\" (UID: \"6fc4122e-7ed8-4da1-8ee2-8373944c2cbc\") " pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-qzsjr" Apr 21 15:02:58.178649 ip-10-0-129-133 kubenswrapper[2610]: I0421 15:02:58.178525 2610 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xfdsc\" (UniqueName: \"kubernetes.io/projected/6fc4122e-7ed8-4da1-8ee2-8373944c2cbc-kube-api-access-xfdsc\") pod \"openshift-lws-operator-bfc7f696d-qzsjr\" (UID: \"6fc4122e-7ed8-4da1-8ee2-8373944c2cbc\") " pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-qzsjr" Apr 21 15:02:58.178831 ip-10-0-129-133 kubenswrapper[2610]: I0421 15:02:58.178809 2610 operation_generator.go:615] "MountVolume.SetUp 
succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/6fc4122e-7ed8-4da1-8ee2-8373944c2cbc-tmp\") pod \"openshift-lws-operator-bfc7f696d-qzsjr\" (UID: \"6fc4122e-7ed8-4da1-8ee2-8373944c2cbc\") " pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-qzsjr" Apr 21 15:02:58.186732 ip-10-0-129-133 kubenswrapper[2610]: I0421 15:02:58.186709 2610 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-xfdsc\" (UniqueName: \"kubernetes.io/projected/6fc4122e-7ed8-4da1-8ee2-8373944c2cbc-kube-api-access-xfdsc\") pod \"openshift-lws-operator-bfc7f696d-qzsjr\" (UID: \"6fc4122e-7ed8-4da1-8ee2-8373944c2cbc\") " pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-qzsjr" Apr 21 15:02:58.330096 ip-10-0-129-133 kubenswrapper[2610]: I0421 15:02:58.330004 2610 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-qzsjr" Apr 21 15:02:58.473343 ip-10-0-129-133 kubenswrapper[2610]: I0421 15:02:58.473309 2610 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-lws-operator/openshift-lws-operator-bfc7f696d-qzsjr"] Apr 21 15:02:58.480385 ip-10-0-129-133 kubenswrapper[2610]: W0421 15:02:58.480353 2610 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6fc4122e_7ed8_4da1_8ee2_8373944c2cbc.slice/crio-64559662c5dfe96d819efabca95f3be13f1b02ba3b5863f86b722129d348f902 WatchSource:0}: Error finding container 64559662c5dfe96d819efabca95f3be13f1b02ba3b5863f86b722129d348f902: Status 404 returned error can't find the container with id 64559662c5dfe96d819efabca95f3be13f1b02ba3b5863f86b722129d348f902 Apr 21 15:02:58.481897 ip-10-0-129-133 kubenswrapper[2610]: I0421 15:02:58.481877 2610 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 21 15:02:59.041773 ip-10-0-129-133 kubenswrapper[2610]: I0421 15:02:59.041737 2610 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-qzsjr" event={"ID":"6fc4122e-7ed8-4da1-8ee2-8373944c2cbc","Type":"ContainerStarted","Data":"64559662c5dfe96d819efabca95f3be13f1b02ba3b5863f86b722129d348f902"} Apr 21 15:03:02.053720 ip-10-0-129-133 kubenswrapper[2610]: I0421 15:03:02.053684 2610 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-qzsjr" event={"ID":"6fc4122e-7ed8-4da1-8ee2-8373944c2cbc","Type":"ContainerStarted","Data":"80f120184ec60913a6475ec89b2ecff51285f5991adbf32b2b76a3ff513e64ef"} Apr 21 15:03:02.074653 ip-10-0-129-133 kubenswrapper[2610]: I0421 15:03:02.074596 2610 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-qzsjr" podStartSLOduration=2.032973321 podStartE2EDuration="5.074559642s" podCreationTimestamp="2026-04-21 15:02:57 +0000 UTC" firstStartedPulling="2026-04-21 15:02:58.482039072 +0000 UTC m=+442.465931099" lastFinishedPulling="2026-04-21 15:03:01.523625392 +0000 UTC m=+445.507517420" observedRunningTime="2026-04-21 15:03:02.073172716 +0000 UTC m=+446.057064769" watchObservedRunningTime="2026-04-21 15:03:02.074559642 +0000 UTC m=+446.058451694" Apr 21 15:03:18.939567 ip-10-0-129-133 kubenswrapper[2610]: I0421 15:03:18.939529 2610 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/opendatahub-operator-controller-manager-6cfc874c8f-jck6m"] Apr 21 15:03:18.942848 ip-10-0-129-133 kubenswrapper[2610]: I0421 
Apr 21 15:03:18.962359 ip-10-0-129-133 kubenswrapper[2610]: I0421 15:03:18.962315 2610 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"opendatahub\"/\"kube-root-ca.crt\""
Apr 21 15:03:18.962549 ip-10-0-129-133 kubenswrapper[2610]: I0421 15:03:18.962498 2610 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"opendatahub-operator-controller-manager-service-cert\""
Apr 21 15:03:18.962804 ip-10-0-129-133 kubenswrapper[2610]: I0421 15:03:18.962786 2610 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"opendatahub-operator-controller-webhook-cert\""
Apr 21 15:03:18.962919 ip-10-0-129-133 kubenswrapper[2610]: I0421 15:03:18.962883 2610 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"opendatahub\"/\"openshift-service-ca.crt\""
Apr 21 15:03:18.967467 ip-10-0-129-133 kubenswrapper[2610]: I0421 15:03:18.967449 2610 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"opendatahub-operator-controller-manager-dockercfg-86sts\""
Apr 21 15:03:19.000441 ip-10-0-129-133 kubenswrapper[2610]: I0421 15:03:19.000414 2610 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/opendatahub-operator-controller-manager-6cfc874c8f-jck6m"]
Apr 21 15:03:19.058003 ip-10-0-129-133 kubenswrapper[2610]: I0421 15:03:19.057975 2610 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/4b20329a-2270-4e86-9339-3f99d193e016-webhook-cert\") pod \"opendatahub-operator-controller-manager-6cfc874c8f-jck6m\" (UID: \"4b20329a-2270-4e86-9339-3f99d193e016\") " pod="opendatahub/opendatahub-operator-controller-manager-6cfc874c8f-jck6m"
Apr 21 15:03:19.058160 ip-10-0-129-133 kubenswrapper[2610]: I0421 15:03:19.058017 2610 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/4b20329a-2270-4e86-9339-3f99d193e016-apiservice-cert\") pod \"opendatahub-operator-controller-manager-6cfc874c8f-jck6m\" (UID: \"4b20329a-2270-4e86-9339-3f99d193e016\") " pod="opendatahub/opendatahub-operator-controller-manager-6cfc874c8f-jck6m"
Apr 21 15:03:19.058160 ip-10-0-129-133 kubenswrapper[2610]: I0421 15:03:19.058054 2610 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k77kr\" (UniqueName: \"kubernetes.io/projected/4b20329a-2270-4e86-9339-3f99d193e016-kube-api-access-k77kr\") pod \"opendatahub-operator-controller-manager-6cfc874c8f-jck6m\" (UID: \"4b20329a-2270-4e86-9339-3f99d193e016\") " pod="opendatahub/opendatahub-operator-controller-manager-6cfc874c8f-jck6m"
Apr 21 15:03:19.158742 ip-10-0-129-133 kubenswrapper[2610]: I0421 15:03:19.158697 2610 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/4b20329a-2270-4e86-9339-3f99d193e016-webhook-cert\") pod \"opendatahub-operator-controller-manager-6cfc874c8f-jck6m\" (UID: \"4b20329a-2270-4e86-9339-3f99d193e016\") " pod="opendatahub/opendatahub-operator-controller-manager-6cfc874c8f-jck6m"
Apr 21 15:03:19.158919 ip-10-0-129-133 kubenswrapper[2610]: I0421 15:03:19.158752 2610 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/4b20329a-2270-4e86-9339-3f99d193e016-apiservice-cert\") pod \"opendatahub-operator-controller-manager-6cfc874c8f-jck6m\" (UID: \"4b20329a-2270-4e86-9339-3f99d193e016\") " pod="opendatahub/opendatahub-operator-controller-manager-6cfc874c8f-jck6m"
Apr 21 15:03:19.158919 ip-10-0-129-133 kubenswrapper[2610]: I0421 15:03:19.158793 2610 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-k77kr\" (UniqueName: \"kubernetes.io/projected/4b20329a-2270-4e86-9339-3f99d193e016-kube-api-access-k77kr\") pod \"opendatahub-operator-controller-manager-6cfc874c8f-jck6m\" (UID: \"4b20329a-2270-4e86-9339-3f99d193e016\") " pod="opendatahub/opendatahub-operator-controller-manager-6cfc874c8f-jck6m"
Apr 21 15:03:19.161362 ip-10-0-129-133 kubenswrapper[2610]: I0421 15:03:19.161338 2610 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/4b20329a-2270-4e86-9339-3f99d193e016-webhook-cert\") pod \"opendatahub-operator-controller-manager-6cfc874c8f-jck6m\" (UID: \"4b20329a-2270-4e86-9339-3f99d193e016\") " pod="opendatahub/opendatahub-operator-controller-manager-6cfc874c8f-jck6m"
Apr 21 15:03:19.161467 ip-10-0-129-133 kubenswrapper[2610]: I0421 15:03:19.161388 2610 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/4b20329a-2270-4e86-9339-3f99d193e016-apiservice-cert\") pod \"opendatahub-operator-controller-manager-6cfc874c8f-jck6m\" (UID: \"4b20329a-2270-4e86-9339-3f99d193e016\") " pod="opendatahub/opendatahub-operator-controller-manager-6cfc874c8f-jck6m"
Apr 21 15:03:19.178457 ip-10-0-129-133 kubenswrapper[2610]: I0421 15:03:19.178423 2610 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-k77kr\" (UniqueName: \"kubernetes.io/projected/4b20329a-2270-4e86-9339-3f99d193e016-kube-api-access-k77kr\") pod \"opendatahub-operator-controller-manager-6cfc874c8f-jck6m\" (UID: \"4b20329a-2270-4e86-9339-3f99d193e016\") " pod="opendatahub/opendatahub-operator-controller-manager-6cfc874c8f-jck6m"
Apr 21 15:03:19.252498 ip-10-0-129-133 kubenswrapper[2610]: I0421 15:03:19.252467 2610 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/opendatahub-operator-controller-manager-6cfc874c8f-jck6m"
Apr 21 15:03:19.398666 ip-10-0-129-133 kubenswrapper[2610]: I0421 15:03:19.398634 2610 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/opendatahub-operator-controller-manager-6cfc874c8f-jck6m"]
Apr 21 15:03:19.409097 ip-10-0-129-133 kubenswrapper[2610]: W0421 15:03:19.409068 2610 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4b20329a_2270_4e86_9339_3f99d193e016.slice/crio-a6e1305ef998d61749804ae760116e93e8482c200494093f7c12acca27e59194 WatchSource:0}: Error finding container a6e1305ef998d61749804ae760116e93e8482c200494093f7c12acca27e59194: Status 404 returned error can't find the container with id a6e1305ef998d61749804ae760116e93e8482c200494093f7c12acca27e59194
Apr 21 15:03:20.107276 ip-10-0-129-133 kubenswrapper[2610]: I0421 15:03:20.107235 2610 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/opendatahub-operator-controller-manager-6cfc874c8f-jck6m" event={"ID":"4b20329a-2270-4e86-9339-3f99d193e016","Type":"ContainerStarted","Data":"a6e1305ef998d61749804ae760116e93e8482c200494093f7c12acca27e59194"}
Apr 21 15:03:22.115661 ip-10-0-129-133 kubenswrapper[2610]: I0421 15:03:22.115622 2610 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/opendatahub-operator-controller-manager-6cfc874c8f-jck6m" event={"ID":"4b20329a-2270-4e86-9339-3f99d193e016","Type":"ContainerStarted","Data":"c0c6cf33adda337c31e241245de1886106ba5cef9b58bb2fc14fe1e48e4a9e5d"}
Apr 21 15:03:22.116048 ip-10-0-129-133 kubenswrapper[2610]: I0421 15:03:22.115761 2610 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="opendatahub/opendatahub-operator-controller-manager-6cfc874c8f-jck6m"
Apr 21 15:03:22.136477 ip-10-0-129-133 kubenswrapper[2610]: I0421 15:03:22.136422 2610 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/opendatahub-operator-controller-manager-6cfc874c8f-jck6m" podStartSLOduration=1.57194476 podStartE2EDuration="4.136408363s" podCreationTimestamp="2026-04-21 15:03:18 +0000 UTC" firstStartedPulling="2026-04-21 15:03:19.410819101 +0000 UTC m=+463.394711127" lastFinishedPulling="2026-04-21 15:03:21.9752827 +0000 UTC m=+465.959174730" observedRunningTime="2026-04-21 15:03:22.1346891 +0000 UTC m=+466.118581148" watchObservedRunningTime="2026-04-21 15:03:22.136408363 +0000 UTC m=+466.120300487"
Apr 21 15:03:31.335302 ip-10-0-129-133 kubenswrapper[2610]: I0421 15:03:31.335255 2610 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-lws-operator/lws-controller-manager-586c4cccd6-kq2hz"]
Apr 21 15:03:31.338649 ip-10-0-129-133 kubenswrapper[2610]: I0421 15:03:31.338631 2610 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-lws-operator/lws-controller-manager-586c4cccd6-kq2hz"
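The "Failed to process watch event ... Status 404" warnings recur for nearly every sandbox started in this log: cAdvisor notices the freshly created crio-<id> cgroup before CRI-O can answer a status query for that container, and the matching ContainerStarted PLEG event lands a moment later. A small sketch that verifies the pairing over a saved copy of this journal (the kubelet.log path is a stand-in, not a real file from this node):

```python
import re

# Container ID from a cAdvisor watch-event warning, and from a PLEG start event.
WATCH_404 = re.compile(r"Failed to process watch event .*crio-([0-9a-f]{64})")
STARTED = re.compile(r'"ContainerStarted","Data":"([0-9a-f]{64})"')

warned, started = set(), set()
with open("kubelet.log") as log:  # stand-in path for this excerpt
    for line in log:
        if m := WATCH_404.search(line):
            warned.add(m.group(1))
        elif m := STARTED.search(line):
            started.add(m.group(1))

# IDs that were 404-warned but never reported started; empty for this excerpt.
print(sorted(warned - started))
```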
Apr 21 15:03:31.341703 ip-10-0-129-133 kubenswrapper[2610]: I0421 15:03:31.341678 2610 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-lws-operator\"/\"webhook-server-cert\""
Apr 21 15:03:31.341841 ip-10-0-129-133 kubenswrapper[2610]: I0421 15:03:31.341739 2610 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-lws-operator\"/\"lws-controller-manager-dockercfg-sz9p8\""
Apr 21 15:03:31.342029 ip-10-0-129-133 kubenswrapper[2610]: I0421 15:03:31.342010 2610 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-lws-operator\"/\"metrics-server-cert\""
Apr 21 15:03:31.342138 ip-10-0-129-133 kubenswrapper[2610]: I0421 15:03:31.342066 2610 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-lws-operator\"/\"lws-manager-config\""
Apr 21 15:03:31.355091 ip-10-0-129-133 kubenswrapper[2610]: I0421 15:03:31.355065 2610 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-lws-operator/lws-controller-manager-586c4cccd6-kq2hz"]
Apr 21 15:03:31.461359 ip-10-0-129-133 kubenswrapper[2610]: I0421 15:03:31.461318 2610 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5k9k5\" (UniqueName: \"kubernetes.io/projected/62cf4840-d50f-418c-8cf6-52fb90e36787-kube-api-access-5k9k5\") pod \"lws-controller-manager-586c4cccd6-kq2hz\" (UID: \"62cf4840-d50f-418c-8cf6-52fb90e36787\") " pod="openshift-lws-operator/lws-controller-manager-586c4cccd6-kq2hz"
Apr 21 15:03:31.461524 ip-10-0-129-133 kubenswrapper[2610]: I0421 15:03:31.461387 2610 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-cert\" (UniqueName: \"kubernetes.io/secret/62cf4840-d50f-418c-8cf6-52fb90e36787-metrics-cert\") pod \"lws-controller-manager-586c4cccd6-kq2hz\" (UID: \"62cf4840-d50f-418c-8cf6-52fb90e36787\") " pod="openshift-lws-operator/lws-controller-manager-586c4cccd6-kq2hz"
Apr 21 15:03:31.461524 ip-10-0-129-133 kubenswrapper[2610]: I0421 15:03:31.461417 2610 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/62cf4840-d50f-418c-8cf6-52fb90e36787-cert\") pod \"lws-controller-manager-586c4cccd6-kq2hz\" (UID: \"62cf4840-d50f-418c-8cf6-52fb90e36787\") " pod="openshift-lws-operator/lws-controller-manager-586c4cccd6-kq2hz"
Apr 21 15:03:31.461524 ip-10-0-129-133 kubenswrapper[2610]: I0421 15:03:31.461445 2610 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"manager-config\" (UniqueName: \"kubernetes.io/configmap/62cf4840-d50f-418c-8cf6-52fb90e36787-manager-config\") pod \"lws-controller-manager-586c4cccd6-kq2hz\" (UID: \"62cf4840-d50f-418c-8cf6-52fb90e36787\") " pod="openshift-lws-operator/lws-controller-manager-586c4cccd6-kq2hz"
Apr 21 15:03:31.562788 ip-10-0-129-133 kubenswrapper[2610]: I0421 15:03:31.562736 2610 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5k9k5\" (UniqueName: \"kubernetes.io/projected/62cf4840-d50f-418c-8cf6-52fb90e36787-kube-api-access-5k9k5\") pod \"lws-controller-manager-586c4cccd6-kq2hz\" (UID: \"62cf4840-d50f-418c-8cf6-52fb90e36787\") " pod="openshift-lws-operator/lws-controller-manager-586c4cccd6-kq2hz"
Apr 21 15:03:31.562957 ip-10-0-129-133 kubenswrapper[2610]: I0421 15:03:31.562830 2610 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-cert\" (UniqueName: \"kubernetes.io/secret/62cf4840-d50f-418c-8cf6-52fb90e36787-metrics-cert\") pod \"lws-controller-manager-586c4cccd6-kq2hz\" (UID: \"62cf4840-d50f-418c-8cf6-52fb90e36787\") " pod="openshift-lws-operator/lws-controller-manager-586c4cccd6-kq2hz"
Apr 21 15:03:31.562957 ip-10-0-129-133 kubenswrapper[2610]: I0421 15:03:31.562862 2610 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/62cf4840-d50f-418c-8cf6-52fb90e36787-cert\") pod \"lws-controller-manager-586c4cccd6-kq2hz\" (UID: \"62cf4840-d50f-418c-8cf6-52fb90e36787\") " pod="openshift-lws-operator/lws-controller-manager-586c4cccd6-kq2hz"
Apr 21 15:03:31.562957 ip-10-0-129-133 kubenswrapper[2610]: I0421 15:03:31.562887 2610 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"manager-config\" (UniqueName: \"kubernetes.io/configmap/62cf4840-d50f-418c-8cf6-52fb90e36787-manager-config\") pod \"lws-controller-manager-586c4cccd6-kq2hz\" (UID: \"62cf4840-d50f-418c-8cf6-52fb90e36787\") " pod="openshift-lws-operator/lws-controller-manager-586c4cccd6-kq2hz"
Apr 21 15:03:31.563496 ip-10-0-129-133 kubenswrapper[2610]: I0421 15:03:31.563477 2610 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"manager-config\" (UniqueName: \"kubernetes.io/configmap/62cf4840-d50f-418c-8cf6-52fb90e36787-manager-config\") pod \"lws-controller-manager-586c4cccd6-kq2hz\" (UID: \"62cf4840-d50f-418c-8cf6-52fb90e36787\") " pod="openshift-lws-operator/lws-controller-manager-586c4cccd6-kq2hz"
Apr 21 15:03:31.565585 ip-10-0-129-133 kubenswrapper[2610]: I0421 15:03:31.565558 2610 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-cert\" (UniqueName: \"kubernetes.io/secret/62cf4840-d50f-418c-8cf6-52fb90e36787-metrics-cert\") pod \"lws-controller-manager-586c4cccd6-kq2hz\" (UID: \"62cf4840-d50f-418c-8cf6-52fb90e36787\") " pod="openshift-lws-operator/lws-controller-manager-586c4cccd6-kq2hz"
Apr 21 15:03:31.565656 ip-10-0-129-133 kubenswrapper[2610]: I0421 15:03:31.565637 2610 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/62cf4840-d50f-418c-8cf6-52fb90e36787-cert\") pod \"lws-controller-manager-586c4cccd6-kq2hz\" (UID: \"62cf4840-d50f-418c-8cf6-52fb90e36787\") " pod="openshift-lws-operator/lws-controller-manager-586c4cccd6-kq2hz"
Apr 21 15:03:31.570961 ip-10-0-129-133 kubenswrapper[2610]: I0421 15:03:31.570937 2610 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-5k9k5\" (UniqueName: \"kubernetes.io/projected/62cf4840-d50f-418c-8cf6-52fb90e36787-kube-api-access-5k9k5\") pod \"lws-controller-manager-586c4cccd6-kq2hz\" (UID: \"62cf4840-d50f-418c-8cf6-52fb90e36787\") " pod="openshift-lws-operator/lws-controller-manager-586c4cccd6-kq2hz"
Apr 21 15:03:31.647862 ip-10-0-129-133 kubenswrapper[2610]: I0421 15:03:31.647768 2610 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-lws-operator/lws-controller-manager-586c4cccd6-kq2hz"
Apr 21 15:03:31.814121 ip-10-0-129-133 kubenswrapper[2610]: I0421 15:03:31.814098 2610 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-lws-operator/lws-controller-manager-586c4cccd6-kq2hz"]
Apr 21 15:03:31.816881 ip-10-0-129-133 kubenswrapper[2610]: W0421 15:03:31.816839 2610 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod62cf4840_d50f_418c_8cf6_52fb90e36787.slice/crio-326b36cd06f19ffb40c2c050bfd998e01d576e6051ba4734b7ba9f3f7122193e WatchSource:0}: Error finding container 326b36cd06f19ffb40c2c050bfd998e01d576e6051ba4734b7ba9f3f7122193e: Status 404 returned error can't find the container with id 326b36cd06f19ffb40c2c050bfd998e01d576e6051ba4734b7ba9f3f7122193e
Apr 21 15:03:32.147293 ip-10-0-129-133 kubenswrapper[2610]: I0421 15:03:32.147256 2610 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-lws-operator/lws-controller-manager-586c4cccd6-kq2hz" event={"ID":"62cf4840-d50f-418c-8cf6-52fb90e36787","Type":"ContainerStarted","Data":"326b36cd06f19ffb40c2c050bfd998e01d576e6051ba4734b7ba9f3f7122193e"}
Apr 21 15:03:33.120688 ip-10-0-129-133 kubenswrapper[2610]: I0421 15:03:33.120655 2610 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="opendatahub/opendatahub-operator-controller-manager-6cfc874c8f-jck6m"
Apr 21 15:03:35.158701 ip-10-0-129-133 kubenswrapper[2610]: I0421 15:03:35.158668 2610 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-lws-operator/lws-controller-manager-586c4cccd6-kq2hz" event={"ID":"62cf4840-d50f-418c-8cf6-52fb90e36787","Type":"ContainerStarted","Data":"f763f41d3e9d8fac546c7f80e020ff15376e4e20695b190a3191ee94c9d186c5"}
Apr 21 15:03:35.159111 ip-10-0-129-133 kubenswrapper[2610]: I0421 15:03:35.158822 2610 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-lws-operator/lws-controller-manager-586c4cccd6-kq2hz"
Apr 21 15:03:35.214431 ip-10-0-129-133 kubenswrapper[2610]: I0421 15:03:35.214376 2610 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-lws-operator/lws-controller-manager-586c4cccd6-kq2hz" podStartSLOduration=1.780845464 podStartE2EDuration="4.21436138s" podCreationTimestamp="2026-04-21 15:03:31 +0000 UTC" firstStartedPulling="2026-04-21 15:03:31.818686691 +0000 UTC m=+475.802578720" lastFinishedPulling="2026-04-21 15:03:34.25220261 +0000 UTC m=+478.236094636" observedRunningTime="2026-04-21 15:03:35.212659108 +0000 UTC m=+479.196551155" watchObservedRunningTime="2026-04-21 15:03:35.21436138 +0000 UTC m=+479.198253428"
Apr 21 15:03:39.213530 ip-10-0-129-133 kubenswrapper[2610]: I0421 15:03:39.213497 2610 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress/kube-auth-proxy-5bcc894b57-6xj5p"]
Apr 21 15:03:39.220909 ip-10-0-129-133 kubenswrapper[2610]: I0421 15:03:39.220883 2610 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/kube-auth-proxy-5bcc894b57-6xj5p"
Apr 21 15:03:39.223108 ip-10-0-129-133 kubenswrapper[2610]: I0421 15:03:39.223081 2610 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"kube-auth-proxy-tls\""
Apr 21 15:03:39.223259 ip-10-0-129-133 kubenswrapper[2610]: I0421 15:03:39.223238 2610 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"kube-auth-proxy-creds\""
Apr 21 15:03:39.223542 ip-10-0-129-133 kubenswrapper[2610]: I0421 15:03:39.223523 2610 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"kube-auth-proxy-dockercfg-k7m4x\""
Apr 21 15:03:39.227353 ip-10-0-129-133 kubenswrapper[2610]: I0421 15:03:39.227331 2610 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/kube-auth-proxy-5bcc894b57-6xj5p"]
Apr 21 15:03:39.325904 ip-10-0-129-133 kubenswrapper[2610]: I0421 15:03:39.325868 2610 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/0f85023e-bf45-4798-9875-d1af959fe30c-tmp\") pod \"kube-auth-proxy-5bcc894b57-6xj5p\" (UID: \"0f85023e-bf45-4798-9875-d1af959fe30c\") " pod="openshift-ingress/kube-auth-proxy-5bcc894b57-6xj5p"
Apr 21 15:03:39.326073 ip-10-0-129-133 kubenswrapper[2610]: I0421 15:03:39.325916 2610 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bzwb6\" (UniqueName: \"kubernetes.io/projected/0f85023e-bf45-4798-9875-d1af959fe30c-kube-api-access-bzwb6\") pod \"kube-auth-proxy-5bcc894b57-6xj5p\" (UID: \"0f85023e-bf45-4798-9875-d1af959fe30c\") " pod="openshift-ingress/kube-auth-proxy-5bcc894b57-6xj5p"
Apr 21 15:03:39.326073 ip-10-0-129-133 kubenswrapper[2610]: I0421 15:03:39.325939 2610 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/0f85023e-bf45-4798-9875-d1af959fe30c-tls-certs\") pod \"kube-auth-proxy-5bcc894b57-6xj5p\" (UID: \"0f85023e-bf45-4798-9875-d1af959fe30c\") " pod="openshift-ingress/kube-auth-proxy-5bcc894b57-6xj5p"
Apr 21 15:03:39.426326 ip-10-0-129-133 kubenswrapper[2610]: I0421 15:03:39.426285 2610 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/0f85023e-bf45-4798-9875-d1af959fe30c-tmp\") pod \"kube-auth-proxy-5bcc894b57-6xj5p\" (UID: \"0f85023e-bf45-4798-9875-d1af959fe30c\") " pod="openshift-ingress/kube-auth-proxy-5bcc894b57-6xj5p"
Apr 21 15:03:39.426475 ip-10-0-129-133 kubenswrapper[2610]: I0421 15:03:39.426337 2610 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-bzwb6\" (UniqueName: \"kubernetes.io/projected/0f85023e-bf45-4798-9875-d1af959fe30c-kube-api-access-bzwb6\") pod \"kube-auth-proxy-5bcc894b57-6xj5p\" (UID: \"0f85023e-bf45-4798-9875-d1af959fe30c\") " pod="openshift-ingress/kube-auth-proxy-5bcc894b57-6xj5p"
Apr 21 15:03:39.426475 ip-10-0-129-133 kubenswrapper[2610]: I0421 15:03:39.426360 2610 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/0f85023e-bf45-4798-9875-d1af959fe30c-tls-certs\") pod \"kube-auth-proxy-5bcc894b57-6xj5p\" (UID: \"0f85023e-bf45-4798-9875-d1af959fe30c\") " pod="openshift-ingress/kube-auth-proxy-5bcc894b57-6xj5p"
Apr 21 15:03:39.428728 ip-10-0-129-133 kubenswrapper[2610]: I0421 15:03:39.428703 2610 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/0f85023e-bf45-4798-9875-d1af959fe30c-tmp\") pod \"kube-auth-proxy-5bcc894b57-6xj5p\" (UID: \"0f85023e-bf45-4798-9875-d1af959fe30c\") " pod="openshift-ingress/kube-auth-proxy-5bcc894b57-6xj5p"
Apr 21 15:03:39.428995 ip-10-0-129-133 kubenswrapper[2610]: I0421 15:03:39.428978 2610 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/0f85023e-bf45-4798-9875-d1af959fe30c-tls-certs\") pod \"kube-auth-proxy-5bcc894b57-6xj5p\" (UID: \"0f85023e-bf45-4798-9875-d1af959fe30c\") " pod="openshift-ingress/kube-auth-proxy-5bcc894b57-6xj5p"
Apr 21 15:03:39.434393 ip-10-0-129-133 kubenswrapper[2610]: I0421 15:03:39.434371 2610 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-bzwb6\" (UniqueName: \"kubernetes.io/projected/0f85023e-bf45-4798-9875-d1af959fe30c-kube-api-access-bzwb6\") pod \"kube-auth-proxy-5bcc894b57-6xj5p\" (UID: \"0f85023e-bf45-4798-9875-d1af959fe30c\") " pod="openshift-ingress/kube-auth-proxy-5bcc894b57-6xj5p"
Apr 21 15:03:39.532710 ip-10-0-129-133 kubenswrapper[2610]: I0421 15:03:39.532677 2610 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/kube-auth-proxy-5bcc894b57-6xj5p"
Apr 21 15:03:39.659413 ip-10-0-129-133 kubenswrapper[2610]: I0421 15:03:39.659389 2610 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/kube-auth-proxy-5bcc894b57-6xj5p"]
Apr 21 15:03:39.662188 ip-10-0-129-133 kubenswrapper[2610]: W0421 15:03:39.662163 2610 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0f85023e_bf45_4798_9875_d1af959fe30c.slice/crio-265c6f8ddc3a801cd2635a41b027a61bc2e27024a527e45bcea1f6fc84306e9d WatchSource:0}: Error finding container 265c6f8ddc3a801cd2635a41b027a61bc2e27024a527e45bcea1f6fc84306e9d: Status 404 returned error can't find the container with id 265c6f8ddc3a801cd2635a41b027a61bc2e27024a527e45bcea1f6fc84306e9d
Apr 21 15:03:40.175597 ip-10-0-129-133 kubenswrapper[2610]: I0421 15:03:40.175544 2610 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/kube-auth-proxy-5bcc894b57-6xj5p" event={"ID":"0f85023e-bf45-4798-9875-d1af959fe30c","Type":"ContainerStarted","Data":"265c6f8ddc3a801cd2635a41b027a61bc2e27024a527e45bcea1f6fc84306e9d"}
Apr 21 15:03:44.193787 ip-10-0-129-133 kubenswrapper[2610]: I0421 15:03:44.193704 2610 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/kube-auth-proxy-5bcc894b57-6xj5p" event={"ID":"0f85023e-bf45-4798-9875-d1af959fe30c","Type":"ContainerStarted","Data":"7dad673f6eb874c1b06ae743888347f90b04372233c8a07e192c5639cec89c58"}
Apr 21 15:03:44.216759 ip-10-0-129-133 kubenswrapper[2610]: I0421 15:03:44.216708 2610 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/kube-auth-proxy-5bcc894b57-6xj5p" podStartSLOduration=1.130023368 podStartE2EDuration="5.216693167s" podCreationTimestamp="2026-04-21 15:03:39 +0000 UTC" firstStartedPulling="2026-04-21 15:03:39.664286397 +0000 UTC m=+483.648178422" lastFinishedPulling="2026-04-21 15:03:43.750956195 +0000 UTC m=+487.734848221" observedRunningTime="2026-04-21 15:03:44.213544542 +0000 UTC m=+488.197436590" watchObservedRunningTime="2026-04-21 15:03:44.216693167 +0000 UTC m=+488.200585267"
Apr 21 15:03:46.164201 ip-10-0-129-133 kubenswrapper[2610]: I0421 15:03:46.164173 2610 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-lws-operator/lws-controller-manager-586c4cccd6-kq2hz"
Apr 21 15:05:36.528883 ip-10-0-129-133 kubenswrapper[2610]: I0421 15:05:36.528846 2610 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-2k25j_d982beb0-1451-48ab-b61a-060b6d23cfc7/console-operator/2.log"
Apr 21 15:05:36.529506 ip-10-0-129-133 kubenswrapper[2610]: I0421 15:05:36.529489 2610 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-2k25j_d982beb0-1451-48ab-b61a-060b6d23cfc7/console-operator/2.log"
Apr 21 15:06:21.207066 ip-10-0-129-133 kubenswrapper[2610]: I0421 15:06:21.207028 2610 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/limitador-limitador-7d549b5b-f78sx"]
Apr 21 15:06:21.210464 ip-10-0-129-133 kubenswrapper[2610]: I0421 15:06:21.210441 2610 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/limitador-limitador-7d549b5b-f78sx"
Apr 21 15:06:21.213491 ip-10-0-129-133 kubenswrapper[2610]: I0421 15:06:21.213468 2610 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"limitador-limits-config-limitador\""
Apr 21 15:06:21.213944 ip-10-0-129-133 kubenswrapper[2610]: I0421 15:06:21.213924 2610 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"openshift-service-ca.crt\""
Apr 21 15:06:21.214048 ip-10-0-129-133 kubenswrapper[2610]: I0421 15:06:21.214011 2610 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"kube-root-ca.crt\""
Apr 21 15:06:21.214138 ip-10-0-129-133 kubenswrapper[2610]: I0421 15:06:21.214122 2610 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"default-dockercfg-cdlh8\""
Apr 21 15:06:21.255832 ip-10-0-129-133 kubenswrapper[2610]: I0421 15:06:21.255797 2610 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-7d549b5b-f78sx"]
Apr 21 15:06:21.321295 ip-10-0-129-133 kubenswrapper[2610]: I0421 15:06:21.321261 2610 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-7d549b5b-f78sx"]
Apr 21 15:06:21.349124 ip-10-0-129-133 kubenswrapper[2610]: I0421 15:06:21.349092 2610 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-file\" (UniqueName: \"kubernetes.io/configmap/a206aa0f-17d9-4d6c-8b15-c2298c69f2b2-config-file\") pod \"limitador-limitador-7d549b5b-f78sx\" (UID: \"a206aa0f-17d9-4d6c-8b15-c2298c69f2b2\") " pod="kuadrant-system/limitador-limitador-7d549b5b-f78sx"
Apr 21 15:06:21.349288 ip-10-0-129-133 kubenswrapper[2610]: I0421 15:06:21.349182 2610 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qtqnm\" (UniqueName: \"kubernetes.io/projected/a206aa0f-17d9-4d6c-8b15-c2298c69f2b2-kube-api-access-qtqnm\") pod \"limitador-limitador-7d549b5b-f78sx\" (UID: \"a206aa0f-17d9-4d6c-8b15-c2298c69f2b2\") " pod="kuadrant-system/limitador-limitador-7d549b5b-f78sx"
Apr 21 15:06:21.450112 ip-10-0-129-133 kubenswrapper[2610]: I0421 15:06:21.450075 2610 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qtqnm\" (UniqueName: \"kubernetes.io/projected/a206aa0f-17d9-4d6c-8b15-c2298c69f2b2-kube-api-access-qtqnm\") pod \"limitador-limitador-7d549b5b-f78sx\" (UID: \"a206aa0f-17d9-4d6c-8b15-c2298c69f2b2\") " pod="kuadrant-system/limitador-limitador-7d549b5b-f78sx"
Apr 21 15:06:21.450112 ip-10-0-129-133 kubenswrapper[2610]: I0421 15:06:21.450119 2610 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-file\" (UniqueName: \"kubernetes.io/configmap/a206aa0f-17d9-4d6c-8b15-c2298c69f2b2-config-file\") pod \"limitador-limitador-7d549b5b-f78sx\" (UID: \"a206aa0f-17d9-4d6c-8b15-c2298c69f2b2\") " pod="kuadrant-system/limitador-limitador-7d549b5b-f78sx"
Apr 21 15:06:21.450767 ip-10-0-129-133 kubenswrapper[2610]: I0421 15:06:21.450748 2610 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-file\" (UniqueName: \"kubernetes.io/configmap/a206aa0f-17d9-4d6c-8b15-c2298c69f2b2-config-file\") pod \"limitador-limitador-7d549b5b-f78sx\" (UID: \"a206aa0f-17d9-4d6c-8b15-c2298c69f2b2\") " pod="kuadrant-system/limitador-limitador-7d549b5b-f78sx"
Apr 21 15:06:21.463112 ip-10-0-129-133 kubenswrapper[2610]: I0421 15:06:21.463055 2610 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qtqnm\" (UniqueName: \"kubernetes.io/projected/a206aa0f-17d9-4d6c-8b15-c2298c69f2b2-kube-api-access-qtqnm\") pod \"limitador-limitador-7d549b5b-f78sx\" (UID: \"a206aa0f-17d9-4d6c-8b15-c2298c69f2b2\") " pod="kuadrant-system/limitador-limitador-7d549b5b-f78sx"
Apr 21 15:06:21.520756 ip-10-0-129-133 kubenswrapper[2610]: I0421 15:06:21.520720 2610 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/limitador-limitador-7d549b5b-f78sx"
Apr 21 15:06:21.684348 ip-10-0-129-133 kubenswrapper[2610]: I0421 15:06:21.684324 2610 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-7d549b5b-f78sx"]
Apr 21 15:06:21.686268 ip-10-0-129-133 kubenswrapper[2610]: W0421 15:06:21.686237 2610 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda206aa0f_17d9_4d6c_8b15_c2298c69f2b2.slice/crio-b9df69fcd8c234c47b6ba68caf3f4fd1711fee4ef7de201727efee7559977203 WatchSource:0}: Error finding container b9df69fcd8c234c47b6ba68caf3f4fd1711fee4ef7de201727efee7559977203: Status 404 returned error can't find the container with id b9df69fcd8c234c47b6ba68caf3f4fd1711fee4ef7de201727efee7559977203
Apr 21 15:06:21.690773 ip-10-0-129-133 kubenswrapper[2610]: I0421 15:06:21.690748 2610 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-limitador-7d549b5b-f78sx" event={"ID":"a206aa0f-17d9-4d6c-8b15-c2298c69f2b2","Type":"ContainerStarted","Data":"b9df69fcd8c234c47b6ba68caf3f4fd1711fee4ef7de201727efee7559977203"}
Apr 21 15:06:22.437826 ip-10-0-129-133 kubenswrapper[2610]: I0421 15:06:22.437787 2610 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/authorino-7498df8756-6tpzl"]
Apr 21 15:06:22.441313 ip-10-0-129-133 kubenswrapper[2610]: I0421 15:06:22.441290 2610 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-7498df8756-6tpzl"
Apr 21 15:06:22.443780 ip-10-0-129-133 kubenswrapper[2610]: I0421 15:06:22.443760 2610 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"authorino-authorino-dockercfg-wqcsw\""
Apr 21 15:06:22.452130 ip-10-0-129-133 kubenswrapper[2610]: I0421 15:06:22.452106 2610 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-7498df8756-6tpzl"]
Apr 21 15:06:22.560335 ip-10-0-129-133 kubenswrapper[2610]: I0421 15:06:22.560299 2610 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kbc5x\" (UniqueName: \"kubernetes.io/projected/ba680588-6b22-41b3-8dcc-ab3ead6e8fee-kube-api-access-kbc5x\") pod \"authorino-7498df8756-6tpzl\" (UID: \"ba680588-6b22-41b3-8dcc-ab3ead6e8fee\") " pod="kuadrant-system/authorino-7498df8756-6tpzl"
Apr 21 15:06:22.661564 ip-10-0-129-133 kubenswrapper[2610]: I0421 15:06:22.661449 2610 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-kbc5x\" (UniqueName: \"kubernetes.io/projected/ba680588-6b22-41b3-8dcc-ab3ead6e8fee-kube-api-access-kbc5x\") pod \"authorino-7498df8756-6tpzl\" (UID: \"ba680588-6b22-41b3-8dcc-ab3ead6e8fee\") " pod="kuadrant-system/authorino-7498df8756-6tpzl"
Apr 21 15:06:22.671012 ip-10-0-129-133 kubenswrapper[2610]: I0421 15:06:22.670973 2610 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-kbc5x\" (UniqueName: \"kubernetes.io/projected/ba680588-6b22-41b3-8dcc-ab3ead6e8fee-kube-api-access-kbc5x\") pod \"authorino-7498df8756-6tpzl\" (UID: \"ba680588-6b22-41b3-8dcc-ab3ead6e8fee\") " pod="kuadrant-system/authorino-7498df8756-6tpzl"
Apr 21 15:06:22.751265 ip-10-0-129-133 kubenswrapper[2610]: I0421 15:06:22.751232 2610 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-7498df8756-6tpzl"
Apr 21 15:06:22.942075 ip-10-0-129-133 kubenswrapper[2610]: I0421 15:06:22.942020 2610 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-7498df8756-6tpzl"]
Apr 21 15:06:22.946740 ip-10-0-129-133 kubenswrapper[2610]: W0421 15:06:22.946695 2610 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podba680588_6b22_41b3_8dcc_ab3ead6e8fee.slice/crio-883fbc1a2c5ca20c68743cd5976d40503d4d11f003b62044e40060d32bd871dc WatchSource:0}: Error finding container 883fbc1a2c5ca20c68743cd5976d40503d4d11f003b62044e40060d32bd871dc: Status 404 returned error can't find the container with id 883fbc1a2c5ca20c68743cd5976d40503d4d11f003b62044e40060d32bd871dc
Apr 21 15:06:23.700184 ip-10-0-129-133 kubenswrapper[2610]: I0421 15:06:23.700141 2610 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-7498df8756-6tpzl" event={"ID":"ba680588-6b22-41b3-8dcc-ab3ead6e8fee","Type":"ContainerStarted","Data":"883fbc1a2c5ca20c68743cd5976d40503d4d11f003b62044e40060d32bd871dc"}
Apr 21 15:06:25.712908 ip-10-0-129-133 kubenswrapper[2610]: I0421 15:06:25.712617 2610 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-limitador-7d549b5b-f78sx" event={"ID":"a206aa0f-17d9-4d6c-8b15-c2298c69f2b2","Type":"ContainerStarted","Data":"047fdaf5682994e932c0e8dcff8dd9cde698b54aad4f7eb485f19dc79e6d2112"}
Apr 21 15:06:25.712908 ip-10-0-129-133 kubenswrapper[2610]: I0421 15:06:25.712691 2610 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/limitador-limitador-7d549b5b-f78sx"
Apr 21 15:06:25.735605 ip-10-0-129-133 kubenswrapper[2610]: I0421 15:06:25.735522 2610 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/limitador-limitador-7d549b5b-f78sx" podStartSLOduration=1.652647582 podStartE2EDuration="4.735507828s" podCreationTimestamp="2026-04-21 15:06:21 +0000 UTC" firstStartedPulling="2026-04-21 15:06:21.687947163 +0000 UTC m=+645.671839189" lastFinishedPulling="2026-04-21 15:06:24.770807395 +0000 UTC m=+648.754699435" observedRunningTime="2026-04-21 15:06:25.733719602 +0000 UTC m=+649.717611650" watchObservedRunningTime="2026-04-21 15:06:25.735507828 +0000 UTC m=+649.719399876"
Apr 21 15:06:26.717535 ip-10-0-129-133 kubenswrapper[2610]: I0421 15:06:26.717484 2610 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-7498df8756-6tpzl" event={"ID":"ba680588-6b22-41b3-8dcc-ab3ead6e8fee","Type":"ContainerStarted","Data":"e8be0e227c026fd23f10f0f39768595421fa29680853ff612247c36ea2f8081d"}
Apr 21 15:06:26.738024 ip-10-0-129-133 kubenswrapper[2610]: I0421 15:06:26.737977 2610 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/authorino-7498df8756-6tpzl" podStartSLOduration=1.10501393 podStartE2EDuration="4.737961139s" podCreationTimestamp="2026-04-21 15:06:22 +0000 UTC" firstStartedPulling="2026-04-21 15:06:22.949076376 +0000 UTC m=+646.932968404" lastFinishedPulling="2026-04-21 15:06:26.582023588 +0000 UTC m=+650.565915613" observedRunningTime="2026-04-21 15:06:26.736918383 +0000 UTC m=+650.720810431" watchObservedRunningTime="2026-04-21 15:06:26.737961139 +0000 UTC m=+650.721853231"
Apr 21 15:06:36.719164 ip-10-0-129-133 kubenswrapper[2610]: I0421 15:06:36.719133 2610 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/limitador-limitador-7d549b5b-f78sx"
Apr 21 15:06:36.745237 ip-10-0-129-133 kubenswrapper[2610]: I0421 15:06:36.745205 2610 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/limitador-limitador-7d549b5b-f78sx"]
Apr 21 15:06:36.748604 ip-10-0-129-133 kubenswrapper[2610]: I0421 15:06:36.748528 2610 kuberuntime_container.go:864] "Killing container with a grace period" pod="kuadrant-system/limitador-limitador-7d549b5b-f78sx" podUID="a206aa0f-17d9-4d6c-8b15-c2298c69f2b2" containerName="limitador" containerID="cri-o://047fdaf5682994e932c0e8dcff8dd9cde698b54aad4f7eb485f19dc79e6d2112" gracePeriod=30
Apr 21 15:06:37.290419 ip-10-0-129-133 kubenswrapper[2610]: I0421 15:06:37.290396 2610 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/limitador-limitador-7d549b5b-f78sx"
Apr 21 15:06:37.391679 ip-10-0-129-133 kubenswrapper[2610]: I0421 15:06:37.391568 2610 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config-file\" (UniqueName: \"kubernetes.io/configmap/a206aa0f-17d9-4d6c-8b15-c2298c69f2b2-config-file\") pod \"a206aa0f-17d9-4d6c-8b15-c2298c69f2b2\" (UID: \"a206aa0f-17d9-4d6c-8b15-c2298c69f2b2\") "
Apr 21 15:06:37.391679 ip-10-0-129-133 kubenswrapper[2610]: I0421 15:06:37.391646 2610 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qtqnm\" (UniqueName: \"kubernetes.io/projected/a206aa0f-17d9-4d6c-8b15-c2298c69f2b2-kube-api-access-qtqnm\") pod \"a206aa0f-17d9-4d6c-8b15-c2298c69f2b2\" (UID: \"a206aa0f-17d9-4d6c-8b15-c2298c69f2b2\") "
Apr 21 15:06:37.391935 ip-10-0-129-133 kubenswrapper[2610]: I0421 15:06:37.391908 2610 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a206aa0f-17d9-4d6c-8b15-c2298c69f2b2-config-file" (OuterVolumeSpecName: "config-file") pod "a206aa0f-17d9-4d6c-8b15-c2298c69f2b2" (UID: "a206aa0f-17d9-4d6c-8b15-c2298c69f2b2"). InnerVolumeSpecName "config-file". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 21 15:06:37.394045 ip-10-0-129-133 kubenswrapper[2610]: I0421 15:06:37.394011 2610 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a206aa0f-17d9-4d6c-8b15-c2298c69f2b2-kube-api-access-qtqnm" (OuterVolumeSpecName: "kube-api-access-qtqnm") pod "a206aa0f-17d9-4d6c-8b15-c2298c69f2b2" (UID: "a206aa0f-17d9-4d6c-8b15-c2298c69f2b2"). InnerVolumeSpecName "kube-api-access-qtqnm". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 21 15:06:37.492851 ip-10-0-129-133 kubenswrapper[2610]: I0421 15:06:37.492817 2610 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-qtqnm\" (UniqueName: \"kubernetes.io/projected/a206aa0f-17d9-4d6c-8b15-c2298c69f2b2-kube-api-access-qtqnm\") on node \"ip-10-0-129-133.ec2.internal\" DevicePath \"\""
Apr 21 15:06:37.492851 ip-10-0-129-133 kubenswrapper[2610]: I0421 15:06:37.492847 2610 reconciler_common.go:299] "Volume detached for volume \"config-file\" (UniqueName: \"kubernetes.io/configmap/a206aa0f-17d9-4d6c-8b15-c2298c69f2b2-config-file\") on node \"ip-10-0-129-133.ec2.internal\" DevicePath \"\""
Apr 21 15:06:37.752372 ip-10-0-129-133 kubenswrapper[2610]: I0421 15:06:37.752340 2610 generic.go:358] "Generic (PLEG): container finished" podID="a206aa0f-17d9-4d6c-8b15-c2298c69f2b2" containerID="047fdaf5682994e932c0e8dcff8dd9cde698b54aad4f7eb485f19dc79e6d2112" exitCode=0
Apr 21 15:06:37.752844 ip-10-0-129-133 kubenswrapper[2610]: I0421 15:06:37.752412 2610 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/limitador-limitador-7d549b5b-f78sx"
Apr 21 15:06:37.752844 ip-10-0-129-133 kubenswrapper[2610]: I0421 15:06:37.752433 2610 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-limitador-7d549b5b-f78sx" event={"ID":"a206aa0f-17d9-4d6c-8b15-c2298c69f2b2","Type":"ContainerDied","Data":"047fdaf5682994e932c0e8dcff8dd9cde698b54aad4f7eb485f19dc79e6d2112"}
Apr 21 15:06:37.752844 ip-10-0-129-133 kubenswrapper[2610]: I0421 15:06:37.752482 2610 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-limitador-7d549b5b-f78sx" event={"ID":"a206aa0f-17d9-4d6c-8b15-c2298c69f2b2","Type":"ContainerDied","Data":"b9df69fcd8c234c47b6ba68caf3f4fd1711fee4ef7de201727efee7559977203"}
Apr 21 15:06:37.752844 ip-10-0-129-133 kubenswrapper[2610]: I0421 15:06:37.752498 2610 scope.go:117] "RemoveContainer" containerID="047fdaf5682994e932c0e8dcff8dd9cde698b54aad4f7eb485f19dc79e6d2112"
Apr 21 15:06:37.761708 ip-10-0-129-133 kubenswrapper[2610]: I0421 15:06:37.761690 2610 scope.go:117] "RemoveContainer" containerID="047fdaf5682994e932c0e8dcff8dd9cde698b54aad4f7eb485f19dc79e6d2112"
Apr 21 15:06:37.762004 ip-10-0-129-133 kubenswrapper[2610]: E0421 15:06:37.761979 2610 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"047fdaf5682994e932c0e8dcff8dd9cde698b54aad4f7eb485f19dc79e6d2112\": container with ID starting with 047fdaf5682994e932c0e8dcff8dd9cde698b54aad4f7eb485f19dc79e6d2112 not found: ID does not exist" containerID="047fdaf5682994e932c0e8dcff8dd9cde698b54aad4f7eb485f19dc79e6d2112"
Apr 21 15:06:37.762101 ip-10-0-129-133 kubenswrapper[2610]: I0421 15:06:37.762011 2610 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"047fdaf5682994e932c0e8dcff8dd9cde698b54aad4f7eb485f19dc79e6d2112"} err="failed to get container status \"047fdaf5682994e932c0e8dcff8dd9cde698b54aad4f7eb485f19dc79e6d2112\": rpc error: code = NotFound desc = could not find container \"047fdaf5682994e932c0e8dcff8dd9cde698b54aad4f7eb485f19dc79e6d2112\": container with ID starting with 047fdaf5682994e932c0e8dcff8dd9cde698b54aad4f7eb485f19dc79e6d2112 not found: ID does not exist"
Apr 21 15:06:37.773240 ip-10-0-129-133 kubenswrapper[2610]: I0421 15:06:37.773215 2610 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/limitador-limitador-7d549b5b-f78sx"]
Apr 21 15:06:37.780770 ip-10-0-129-133 kubenswrapper[2610]: I0421 15:06:37.779227 2610 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/limitador-limitador-7d549b5b-f78sx"]
Apr 21 15:06:38.626557 ip-10-0-129-133 kubenswrapper[2610]: I0421 15:06:38.626525 2610 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a206aa0f-17d9-4d6c-8b15-c2298c69f2b2" path="/var/lib/kubelet/pods/a206aa0f-17d9-4d6c-8b15-c2298c69f2b2/volumes"
Apr 21 15:06:42.619568 ip-10-0-129-133 kubenswrapper[2610]: I0421 15:06:42.619532 2610 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/postgres-868db5846d-wh579"]
Apr 21 15:06:42.619967 ip-10-0-129-133 kubenswrapper[2610]: I0421 15:06:42.619873 2610 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="a206aa0f-17d9-4d6c-8b15-c2298c69f2b2" containerName="limitador"
Apr 21 15:06:42.619967 ip-10-0-129-133 kubenswrapper[2610]: I0421 15:06:42.619886 2610 state_mem.go:107] "Deleted CPUSet assignment" podUID="a206aa0f-17d9-4d6c-8b15-c2298c69f2b2" containerName="limitador"
Apr 21 15:06:42.619967 ip-10-0-129-133 kubenswrapper[2610]: I0421 15:06:42.619948 2610 memory_manager.go:356] "RemoveStaleState removing state" podUID="a206aa0f-17d9-4d6c-8b15-c2298c69f2b2" containerName="limitador"
Apr 21 15:06:42.622630 ip-10-0-129-133 kubenswrapper[2610]: I0421 15:06:42.622600 2610 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/postgres-868db5846d-wh579"
Apr 21 15:06:42.624555 ip-10-0-129-133 kubenswrapper[2610]: I0421 15:06:42.624530 2610 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"postgres-creds\""
Apr 21 15:06:42.624659 ip-10-0-129-133 kubenswrapper[2610]: I0421 15:06:42.624609 2610 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"default-dockercfg-297q5\""
Apr 21 15:06:42.631361 ip-10-0-129-133 kubenswrapper[2610]: I0421 15:06:42.631330 2610 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/postgres-868db5846d-wh579"]
Apr 21 15:06:42.739117 ip-10-0-129-133 kubenswrapper[2610]: I0421 15:06:42.739076 2610 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dxhm4\" (UniqueName: \"kubernetes.io/projected/3f6c0ccf-bc8a-467c-b976-97198d25282c-kube-api-access-dxhm4\") pod \"postgres-868db5846d-wh579\" (UID: \"3f6c0ccf-bc8a-467c-b976-97198d25282c\") " pod="opendatahub/postgres-868db5846d-wh579"
Apr 21 15:06:42.739296 ip-10-0-129-133 kubenswrapper[2610]: I0421 15:06:42.739227 2610 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/3f6c0ccf-bc8a-467c-b976-97198d25282c-data\") pod \"postgres-868db5846d-wh579\" (UID: \"3f6c0ccf-bc8a-467c-b976-97198d25282c\") " pod="opendatahub/postgres-868db5846d-wh579"
Apr 21 15:06:42.839931 ip-10-0-129-133 kubenswrapper[2610]: I0421 15:06:42.839891 2610 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-dxhm4\" (UniqueName: \"kubernetes.io/projected/3f6c0ccf-bc8a-467c-b976-97198d25282c-kube-api-access-dxhm4\") pod \"postgres-868db5846d-wh579\" (UID: \"3f6c0ccf-bc8a-467c-b976-97198d25282c\") " pod="opendatahub/postgres-868db5846d-wh579"
Apr 21 15:06:42.839931 ip-10-0-129-133 kubenswrapper[2610]: I0421 15:06:42.839934 2610 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/3f6c0ccf-bc8a-467c-b976-97198d25282c-data\") pod \"postgres-868db5846d-wh579\" (UID: \"3f6c0ccf-bc8a-467c-b976-97198d25282c\") " pod="opendatahub/postgres-868db5846d-wh579"
Apr 21 15:06:42.840283 ip-10-0-129-133 kubenswrapper[2610]: I0421 15:06:42.840268 2610 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/3f6c0ccf-bc8a-467c-b976-97198d25282c-data\") pod \"postgres-868db5846d-wh579\" (UID: \"3f6c0ccf-bc8a-467c-b976-97198d25282c\") " pod="opendatahub/postgres-868db5846d-wh579"
Apr 21 15:06:42.847673 ip-10-0-129-133 kubenswrapper[2610]: I0421 15:06:42.847644 2610 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-dxhm4\" (UniqueName: \"kubernetes.io/projected/3f6c0ccf-bc8a-467c-b976-97198d25282c-kube-api-access-dxhm4\") pod \"postgres-868db5846d-wh579\" (UID: \"3f6c0ccf-bc8a-467c-b976-97198d25282c\") " pod="opendatahub/postgres-868db5846d-wh579"
Apr 21 15:06:42.935487 ip-10-0-129-133 kubenswrapper[2610]: I0421 15:06:42.935406 2610 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/postgres-868db5846d-wh579"
Apr 21 15:06:43.061384 ip-10-0-129-133 kubenswrapper[2610]: I0421 15:06:43.061353 2610 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/postgres-868db5846d-wh579"]
Apr 21 15:06:43.064291 ip-10-0-129-133 kubenswrapper[2610]: W0421 15:06:43.064259 2610 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3f6c0ccf_bc8a_467c_b976_97198d25282c.slice/crio-ea0a5143569d29ab526276cdb0457f0c88ea940432b834e1f1882e9f31fb46a2 WatchSource:0}: Error finding container ea0a5143569d29ab526276cdb0457f0c88ea940432b834e1f1882e9f31fb46a2: Status 404 returned error can't find the container with id ea0a5143569d29ab526276cdb0457f0c88ea940432b834e1f1882e9f31fb46a2
Apr 21 15:06:43.775023 ip-10-0-129-133 kubenswrapper[2610]: I0421 15:06:43.774988 2610 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/postgres-868db5846d-wh579" event={"ID":"3f6c0ccf-bc8a-467c-b976-97198d25282c","Type":"ContainerStarted","Data":"ea0a5143569d29ab526276cdb0457f0c88ea940432b834e1f1882e9f31fb46a2"}
Apr 21 15:06:48.798078 ip-10-0-129-133 kubenswrapper[2610]: I0421 15:06:48.798044 2610 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/postgres-868db5846d-wh579" event={"ID":"3f6c0ccf-bc8a-467c-b976-97198d25282c","Type":"ContainerStarted","Data":"176c81c40a16c3d6f793d79c71b113c704ead17248c0bb336f0749774c43445f"}
Apr 21 15:06:48.798546 ip-10-0-129-133 kubenswrapper[2610]: I0421 15:06:48.798191 2610 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="opendatahub/postgres-868db5846d-wh579"
Apr 21 15:06:48.818471 ip-10-0-129-133 kubenswrapper[2610]: I0421 15:06:48.818423 2610 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/postgres-868db5846d-wh579" podStartSLOduration=1.436875651 podStartE2EDuration="6.818409674s" podCreationTimestamp="2026-04-21 15:06:42 +0000 UTC" firstStartedPulling="2026-04-21 15:06:43.065649199 +0000 UTC m=+667.049541225" lastFinishedPulling="2026-04-21 15:06:48.447183222 +0000 UTC m=+672.431075248" observedRunningTime="2026-04-21 15:06:48.814956873 +0000 UTC m=+672.798848920" watchObservedRunningTime="2026-04-21 15:06:48.818409674 +0000 UTC m=+672.802301721"
Apr 21 15:06:54.832714 ip-10-0-129-133 kubenswrapper[2610]: I0421 15:06:54.832684 2610 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="opendatahub/postgres-868db5846d-wh579"
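The limitador teardown at 15:06:36-15:06:38 above is a complete graceful-deletion sequence: SyncLoop DELETE, "Killing container with a grace period" (gracePeriod=30, though the container exited with exitCode=0 well inside it), volume unmount and detach, the ContainerDied PLEG events, RemoveContainer with a benign NotFound when the follow-up status call races CRI-O's own cleanup, SyncLoop REMOVE, and finally the orphaned volumes dir cleanup. A sketch that extracts such a per-pod teardown timeline from the journal text (path again a stand-in):

```python
import re

# Teardown-related markers as they appear verbatim in kubelet log lines.
EVENT = re.compile(
    r"^(?P<ts>\w+ \d+ [\d:.]+) .*?(?P<kind>"
    r"SyncLoop DELETE|Killing container with a grace period|ContainerDied|"
    r"RemoveContainer|SyncLoop REMOVE|Cleaned up orphaned pod volumes)"
)

def teardown_timeline(path: str, needle: str) -> list[tuple[str, str]]:
    # `needle` can be the pod name, or its UID to also catch the
    # volume-dir cleanup line, which only mentions the UID.
    events = []
    with open(path) as log:
        for line in log:
            if needle in line and (m := EVENT.search(line)):
                events.append((m.group("ts"), m.group("kind")))
    return events

for when, what in teardown_timeline("kubelet.log", "limitador-limitador-7d549b5b-f78sx"):
    print(when, what)
```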
Apr 21 15:07:05.892985 ip-10-0-129-133 kubenswrapper[2610]: I0421 15:07:05.892947 2610 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["keycloak-system/keycloak-operator-5c4df598dd-547q8"]
Apr 21 15:07:05.895145 ip-10-0-129-133 kubenswrapper[2610]: I0421 15:07:05.895128 2610 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="keycloak-system/keycloak-operator-5c4df598dd-547q8"
Apr 21 15:07:05.900305 ip-10-0-129-133 kubenswrapper[2610]: I0421 15:07:05.900279 2610 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"keycloak-system\"/\"kube-root-ca.crt\""
Apr 21 15:07:05.900781 ip-10-0-129-133 kubenswrapper[2610]: I0421 15:07:05.900763 2610 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"keycloak-system\"/\"keycloak-operator-dockercfg-csnvj\""
Apr 21 15:07:05.900907 ip-10-0-129-133 kubenswrapper[2610]: I0421 15:07:05.900764 2610 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"keycloak-system\"/\"openshift-service-ca.crt\""
Apr 21 15:07:05.911536 ip-10-0-129-133 kubenswrapper[2610]: I0421 15:07:05.911505 2610 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wt9vj\" (UniqueName: \"kubernetes.io/projected/345a8b2a-9f4a-4b09-97e0-20eec530d511-kube-api-access-wt9vj\") pod \"keycloak-operator-5c4df598dd-547q8\" (UID: \"345a8b2a-9f4a-4b09-97e0-20eec530d511\") " pod="keycloak-system/keycloak-operator-5c4df598dd-547q8"
Apr 21 15:07:05.913517 ip-10-0-129-133 kubenswrapper[2610]: I0421 15:07:05.913473 2610 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["keycloak-system/keycloak-operator-5c4df598dd-547q8"]
Apr 21 15:07:06.011988 ip-10-0-129-133 kubenswrapper[2610]: I0421 15:07:06.011953 2610 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wt9vj\" (UniqueName: \"kubernetes.io/projected/345a8b2a-9f4a-4b09-97e0-20eec530d511-kube-api-access-wt9vj\") pod \"keycloak-operator-5c4df598dd-547q8\" (UID: \"345a8b2a-9f4a-4b09-97e0-20eec530d511\") " pod="keycloak-system/keycloak-operator-5c4df598dd-547q8"
Apr 21 15:07:06.028300 ip-10-0-129-133 kubenswrapper[2610]: I0421 15:07:06.028267 2610 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-wt9vj\" (UniqueName: \"kubernetes.io/projected/345a8b2a-9f4a-4b09-97e0-20eec530d511-kube-api-access-wt9vj\") pod \"keycloak-operator-5c4df598dd-547q8\" (UID: \"345a8b2a-9f4a-4b09-97e0-20eec530d511\") " pod="keycloak-system/keycloak-operator-5c4df598dd-547q8"
Apr 21 15:07:06.205379 ip-10-0-129-133 kubenswrapper[2610]: I0421 15:07:06.205300 2610 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="keycloak-system/keycloak-operator-5c4df598dd-547q8"
Apr 21 15:07:06.340704 ip-10-0-129-133 kubenswrapper[2610]: I0421 15:07:06.340671 2610 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["keycloak-system/keycloak-operator-5c4df598dd-547q8"]
Apr 21 15:07:06.344132 ip-10-0-129-133 kubenswrapper[2610]: W0421 15:07:06.344101 2610 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod345a8b2a_9f4a_4b09_97e0_20eec530d511.slice/crio-245cb7b6133a64143da0353ca2cfca4f5b3185936450e9c78052b72aa1a9f1e3 WatchSource:0}: Error finding container 245cb7b6133a64143da0353ca2cfca4f5b3185936450e9c78052b72aa1a9f1e3: Status 404 returned error can't find the container with id 245cb7b6133a64143da0353ca2cfca4f5b3185936450e9c78052b72aa1a9f1e3
Apr 21 15:07:06.858990 ip-10-0-129-133 kubenswrapper[2610]: I0421 15:07:06.858953 2610 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="keycloak-system/keycloak-operator-5c4df598dd-547q8" event={"ID":"345a8b2a-9f4a-4b09-97e0-20eec530d511","Type":"ContainerStarted","Data":"245cb7b6133a64143da0353ca2cfca4f5b3185936450e9c78052b72aa1a9f1e3"}
Apr 21 15:07:12.883353 ip-10-0-129-133 kubenswrapper[2610]: I0421 15:07:12.883305 2610 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="keycloak-system/keycloak-operator-5c4df598dd-547q8" event={"ID":"345a8b2a-9f4a-4b09-97e0-20eec530d511","Type":"ContainerStarted","Data":"13b47036b71b5e0ea7a828efeb78890793a2de711c1b2f7bad7fd823ca396cb1"}
Apr 21 15:07:52.572028 ip-10-0-129-133 kubenswrapper[2610]: I0421 15:07:52.571967 2610 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="keycloak-system/keycloak-operator-5c4df598dd-547q8" podStartSLOduration=41.249308514 podStartE2EDuration="47.571952158s" podCreationTimestamp="2026-04-21 15:07:05 +0000 UTC" firstStartedPulling="2026-04-21 15:07:06.345409566 +0000 UTC m=+690.329301592" lastFinishedPulling="2026-04-21 15:07:12.668053211 +0000 UTC m=+696.651945236" observedRunningTime="2026-04-21 15:07:12.920414647 +0000 UTC m=+696.904306694" watchObservedRunningTime="2026-04-21 15:07:52.571952158 +0000 UTC m=+736.555844206"
Apr 21 15:07:52.572491 ip-10-0-129-133 kubenswrapper[2610]: I0421 15:07:52.572085 2610 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/authorino-8b475cf9f-v8l52"]
Apr 21 15:07:52.578525 ip-10-0-129-133 kubenswrapper[2610]: I0421 15:07:52.578503 2610 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-8b475cf9f-v8l52"
Apr 21 15:07:52.600599 ip-10-0-129-133 kubenswrapper[2610]: I0421 15:07:52.600545 2610 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-8b475cf9f-v8l52"]
Apr 21 15:07:52.724103 ip-10-0-129-133 kubenswrapper[2610]: I0421 15:07:52.724070 2610 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m4bqk\" (UniqueName: \"kubernetes.io/projected/36c2d683-6c9e-409b-af3f-8bd4b1888087-kube-api-access-m4bqk\") pod \"authorino-8b475cf9f-v8l52\" (UID: \"36c2d683-6c9e-409b-af3f-8bd4b1888087\") " pod="kuadrant-system/authorino-8b475cf9f-v8l52"
Apr 21 15:07:52.818612 ip-10-0-129-133 kubenswrapper[2610]: I0421 15:07:52.818562 2610 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-8b475cf9f-v8l52"]
Apr 21 15:07:52.818866 ip-10-0-129-133 kubenswrapper[2610]: E0421 15:07:52.818832 2610 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[kube-api-access-m4bqk], unattached volumes=[], failed to process volumes=[]: context canceled" pod="kuadrant-system/authorino-8b475cf9f-v8l52" podUID="36c2d683-6c9e-409b-af3f-8bd4b1888087"
Apr 21 15:07:52.825430 ip-10-0-129-133 kubenswrapper[2610]: I0421 15:07:52.825358 2610 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-m4bqk\" (UniqueName: \"kubernetes.io/projected/36c2d683-6c9e-409b-af3f-8bd4b1888087-kube-api-access-m4bqk\") pod \"authorino-8b475cf9f-v8l52\" (UID: \"36c2d683-6c9e-409b-af3f-8bd4b1888087\") " pod="kuadrant-system/authorino-8b475cf9f-v8l52"
Apr 21 15:07:52.842462 ip-10-0-129-133 kubenswrapper[2610]: I0421 15:07:52.842441 2610 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-m4bqk\" (UniqueName: \"kubernetes.io/projected/36c2d683-6c9e-409b-af3f-8bd4b1888087-kube-api-access-m4bqk\") pod \"authorino-8b475cf9f-v8l52\" (UID: \"36c2d683-6c9e-409b-af3f-8bd4b1888087\") " pod="kuadrant-system/authorino-8b475cf9f-v8l52"
Apr 21 15:07:53.017012 ip-10-0-129-133 kubenswrapper[2610]: I0421 15:07:53.016986 2610 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-8b475cf9f-v8l52"
Apr 21 15:07:53.021770 ip-10-0-129-133 kubenswrapper[2610]: I0421 15:07:53.021748 2610 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-8b475cf9f-v8l52"
Apr 21 15:07:53.128012 ip-10-0-129-133 kubenswrapper[2610]: I0421 15:07:53.127913 2610 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m4bqk\" (UniqueName: \"kubernetes.io/projected/36c2d683-6c9e-409b-af3f-8bd4b1888087-kube-api-access-m4bqk\") pod \"36c2d683-6c9e-409b-af3f-8bd4b1888087\" (UID: \"36c2d683-6c9e-409b-af3f-8bd4b1888087\") "
Apr 21 15:07:53.130060 ip-10-0-129-133 kubenswrapper[2610]: I0421 15:07:53.130037 2610 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/36c2d683-6c9e-409b-af3f-8bd4b1888087-kube-api-access-m4bqk" (OuterVolumeSpecName: "kube-api-access-m4bqk") pod "36c2d683-6c9e-409b-af3f-8bd4b1888087" (UID: "36c2d683-6c9e-409b-af3f-8bd4b1888087"). InnerVolumeSpecName "kube-api-access-m4bqk". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 21 15:07:53.228790 ip-10-0-129-133 kubenswrapper[2610]: I0421 15:07:53.228757 2610 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-m4bqk\" (UniqueName: \"kubernetes.io/projected/36c2d683-6c9e-409b-af3f-8bd4b1888087-kube-api-access-m4bqk\") on node \"ip-10-0-129-133.ec2.internal\" DevicePath \"\""
Apr 21 15:07:54.019972 ip-10-0-129-133 kubenswrapper[2610]: I0421 15:07:54.019936 2610 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-8b475cf9f-v8l52"
Apr 21 15:07:54.058941 ip-10-0-129-133 kubenswrapper[2610]: I0421 15:07:54.058909 2610 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-8b475cf9f-v8l52"]
Apr 21 15:07:54.062112 ip-10-0-129-133 kubenswrapper[2610]: I0421 15:07:54.062084 2610 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/authorino-8b475cf9f-v8l52"]
Apr 21 15:07:54.155796 ip-10-0-129-133 kubenswrapper[2610]: I0421 15:07:54.155768 2610 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-7498df8756-6tpzl"]
Apr 21 15:07:54.155966 ip-10-0-129-133 kubenswrapper[2610]: I0421 15:07:54.155946 2610 kuberuntime_container.go:864] "Killing container with a grace period" pod="kuadrant-system/authorino-7498df8756-6tpzl" podUID="ba680588-6b22-41b3-8dcc-ab3ead6e8fee" containerName="authorino" containerID="cri-o://e8be0e227c026fd23f10f0f39768595421fa29680853ff612247c36ea2f8081d" gracePeriod=30
Apr 21 15:07:54.408026 ip-10-0-129-133 kubenswrapper[2610]: I0421 15:07:54.408005 2610 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-7498df8756-6tpzl"
Apr 21 15:07:54.540329 ip-10-0-129-133 kubenswrapper[2610]: I0421 15:07:54.540241 2610 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kbc5x\" (UniqueName: \"kubernetes.io/projected/ba680588-6b22-41b3-8dcc-ab3ead6e8fee-kube-api-access-kbc5x\") pod \"ba680588-6b22-41b3-8dcc-ab3ead6e8fee\" (UID: \"ba680588-6b22-41b3-8dcc-ab3ead6e8fee\") "
Apr 21 15:07:54.542424 ip-10-0-129-133 kubenswrapper[2610]: I0421 15:07:54.542393 2610 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ba680588-6b22-41b3-8dcc-ab3ead6e8fee-kube-api-access-kbc5x" (OuterVolumeSpecName: "kube-api-access-kbc5x") pod "ba680588-6b22-41b3-8dcc-ab3ead6e8fee" (UID: "ba680588-6b22-41b3-8dcc-ab3ead6e8fee"). InnerVolumeSpecName "kube-api-access-kbc5x". PluginName "kubernetes.io/projected", VolumeGIDValue ""
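The "Error syncing pod, skipping ... context canceled" entry for authorino-8b475cf9f-v8l52 above is the signature of a pod deleted while its first sync was still in flight: the SyncLoop DELETE at 15:07:52.818 cancels the pod worker's context mid volume setup, the kube-api-access-m4bqk volume that had just finished mounting is unmounted and detached at 15:07:53, and the pod is REMOVEd at 15:07:54 without its sandbox ever running a container.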
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 21 15:07:54.625392 ip-10-0-129-133 kubenswrapper[2610]: I0421 15:07:54.625354 2610 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="36c2d683-6c9e-409b-af3f-8bd4b1888087" path="/var/lib/kubelet/pods/36c2d683-6c9e-409b-af3f-8bd4b1888087/volumes" Apr 21 15:07:54.641456 ip-10-0-129-133 kubenswrapper[2610]: I0421 15:07:54.641434 2610 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-kbc5x\" (UniqueName: \"kubernetes.io/projected/ba680588-6b22-41b3-8dcc-ab3ead6e8fee-kube-api-access-kbc5x\") on node \"ip-10-0-129-133.ec2.internal\" DevicePath \"\"" Apr 21 15:07:55.024407 ip-10-0-129-133 kubenswrapper[2610]: I0421 15:07:55.024373 2610 generic.go:358] "Generic (PLEG): container finished" podID="ba680588-6b22-41b3-8dcc-ab3ead6e8fee" containerID="e8be0e227c026fd23f10f0f39768595421fa29680853ff612247c36ea2f8081d" exitCode=0 Apr 21 15:07:55.024845 ip-10-0-129-133 kubenswrapper[2610]: I0421 15:07:55.024424 2610 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-7498df8756-6tpzl" Apr 21 15:07:55.024845 ip-10-0-129-133 kubenswrapper[2610]: I0421 15:07:55.024456 2610 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-7498df8756-6tpzl" event={"ID":"ba680588-6b22-41b3-8dcc-ab3ead6e8fee","Type":"ContainerDied","Data":"e8be0e227c026fd23f10f0f39768595421fa29680853ff612247c36ea2f8081d"} Apr 21 15:07:55.024845 ip-10-0-129-133 kubenswrapper[2610]: I0421 15:07:55.024494 2610 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-7498df8756-6tpzl" event={"ID":"ba680588-6b22-41b3-8dcc-ab3ead6e8fee","Type":"ContainerDied","Data":"883fbc1a2c5ca20c68743cd5976d40503d4d11f003b62044e40060d32bd871dc"} Apr 21 15:07:55.024845 ip-10-0-129-133 kubenswrapper[2610]: I0421 15:07:55.024510 2610 scope.go:117] "RemoveContainer" containerID="e8be0e227c026fd23f10f0f39768595421fa29680853ff612247c36ea2f8081d" Apr 21 15:07:55.032522 ip-10-0-129-133 kubenswrapper[2610]: I0421 15:07:55.032501 2610 scope.go:117] "RemoveContainer" containerID="e8be0e227c026fd23f10f0f39768595421fa29680853ff612247c36ea2f8081d" Apr 21 15:07:55.032794 ip-10-0-129-133 kubenswrapper[2610]: E0421 15:07:55.032772 2610 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e8be0e227c026fd23f10f0f39768595421fa29680853ff612247c36ea2f8081d\": container with ID starting with e8be0e227c026fd23f10f0f39768595421fa29680853ff612247c36ea2f8081d not found: ID does not exist" containerID="e8be0e227c026fd23f10f0f39768595421fa29680853ff612247c36ea2f8081d" Apr 21 15:07:55.032847 ip-10-0-129-133 kubenswrapper[2610]: I0421 15:07:55.032803 2610 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e8be0e227c026fd23f10f0f39768595421fa29680853ff612247c36ea2f8081d"} err="failed to get container status \"e8be0e227c026fd23f10f0f39768595421fa29680853ff612247c36ea2f8081d\": rpc error: code = NotFound desc = could not find container \"e8be0e227c026fd23f10f0f39768595421fa29680853ff612247c36ea2f8081d\": container with ID starting with e8be0e227c026fd23f10f0f39768595421fa29680853ff612247c36ea2f8081d not found: ID does not exist" Apr 21 15:07:55.050829 ip-10-0-129-133 kubenswrapper[2610]: I0421 15:07:55.050800 2610 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-7498df8756-6tpzl"] Apr 21 15:07:55.065413 ip-10-0-129-133 kubenswrapper[2610]: I0421 
15:07:55.065389 2610 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/authorino-7498df8756-6tpzl"] Apr 21 15:07:56.625790 ip-10-0-129-133 kubenswrapper[2610]: I0421 15:07:56.625753 2610 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ba680588-6b22-41b3-8dcc-ab3ead6e8fee" path="/var/lib/kubelet/pods/ba680588-6b22-41b3-8dcc-ab3ead6e8fee/volumes" Apr 21 15:09:12.772644 ip-10-0-129-133 kubenswrapper[2610]: I0421 15:09:12.772539 2610 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/authorino-675f5f7d99-sbtlg"] Apr 21 15:09:12.773218 ip-10-0-129-133 kubenswrapper[2610]: I0421 15:09:12.773046 2610 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="ba680588-6b22-41b3-8dcc-ab3ead6e8fee" containerName="authorino" Apr 21 15:09:12.773218 ip-10-0-129-133 kubenswrapper[2610]: I0421 15:09:12.773066 2610 state_mem.go:107] "Deleted CPUSet assignment" podUID="ba680588-6b22-41b3-8dcc-ab3ead6e8fee" containerName="authorino" Apr 21 15:09:12.773218 ip-10-0-129-133 kubenswrapper[2610]: I0421 15:09:12.773169 2610 memory_manager.go:356] "RemoveStaleState removing state" podUID="ba680588-6b22-41b3-8dcc-ab3ead6e8fee" containerName="authorino" Apr 21 15:09:12.775055 ip-10-0-129-133 kubenswrapper[2610]: I0421 15:09:12.775034 2610 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-675f5f7d99-sbtlg" Apr 21 15:09:12.777736 ip-10-0-129-133 kubenswrapper[2610]: I0421 15:09:12.777718 2610 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"authorino-server-cert\"" Apr 21 15:09:12.778070 ip-10-0-129-133 kubenswrapper[2610]: I0421 15:09:12.778054 2610 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"openshift-service-ca.crt\"" Apr 21 15:09:12.778531 ip-10-0-129-133 kubenswrapper[2610]: I0421 15:09:12.778513 2610 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"kube-root-ca.crt\"" Apr 21 15:09:12.778729 ip-10-0-129-133 kubenswrapper[2610]: I0421 15:09:12.778585 2610 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"authorino-authorino-dockercfg-wqcsw\"" Apr 21 15:09:12.778839 ip-10-0-129-133 kubenswrapper[2610]: I0421 15:09:12.778822 2610 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"authorino-oidc-ca\"" Apr 21 15:09:12.788208 ip-10-0-129-133 kubenswrapper[2610]: I0421 15:09:12.788188 2610 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-675f5f7d99-sbtlg"] Apr 21 15:09:12.869229 ip-10-0-129-133 kubenswrapper[2610]: I0421 15:09:12.869193 2610 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xxp5g\" (UniqueName: \"kubernetes.io/projected/08a0c2d4-9e75-4601-8592-fa86229571ae-kube-api-access-xxp5g\") pod \"authorino-675f5f7d99-sbtlg\" (UID: \"08a0c2d4-9e75-4601-8592-fa86229571ae\") " pod="kuadrant-system/authorino-675f5f7d99-sbtlg" Apr 21 15:09:12.869229 ip-10-0-129-133 kubenswrapper[2610]: I0421 15:09:12.869230 2610 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-cert\" (UniqueName: \"kubernetes.io/secret/08a0c2d4-9e75-4601-8592-fa86229571ae-tls-cert\") pod \"authorino-675f5f7d99-sbtlg\" (UID: \"08a0c2d4-9e75-4601-8592-fa86229571ae\") " pod="kuadrant-system/authorino-675f5f7d99-sbtlg" Apr 21 15:09:12.869464 
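
The kubelet entries above follow the klog header convention: a severity letter (I, W, E), the date as MMDD, a wall-clock timestamp, the emitting PID, and the source file:line, followed by the structured message. A minimal Go sketch of a header parser, assuming exactly this layout (the regexp and field names are my own illustration, not anything shipped with Kubernetes):

    package main

    import (
        "fmt"
        "regexp"
    )

    // klogHeader matches the header printed on each kubelet line above:
    // severity letter, MMDD date, wall-clock time, PID, source file:line,
    // then the message itself.
    var klogHeader = regexp.MustCompile(`^([IWEF])(\d{4}) (\d{2}:\d{2}:\d{2}\.\d+)\s+(\d+) ([\w.]+:\d+)\] (.*)$`)

    func main() {
        line := `I0421 15:07:52.600545 2610 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-8b475cf9f-v8l52"]`
        m := klogHeader.FindStringSubmatch(line)
        if m == nil {
            fmt.Println("not a klog-formatted line")
            return
        }
        fmt.Printf("severity=%s date=%s time=%s pid=%s src=%s\n", m[1], m[2], m[3], m[4], m[5])
        fmt.Println("message:", m[6])
    }
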
Apr 21 15:09:12.772644 ip-10-0-129-133 kubenswrapper[2610]: I0421 15:09:12.772539 2610 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/authorino-675f5f7d99-sbtlg"]
Apr 21 15:09:12.773218 ip-10-0-129-133 kubenswrapper[2610]: I0421 15:09:12.773046 2610 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="ba680588-6b22-41b3-8dcc-ab3ead6e8fee" containerName="authorino"
Apr 21 15:09:12.773218 ip-10-0-129-133 kubenswrapper[2610]: I0421 15:09:12.773066 2610 state_mem.go:107] "Deleted CPUSet assignment" podUID="ba680588-6b22-41b3-8dcc-ab3ead6e8fee" containerName="authorino"
Apr 21 15:09:12.773218 ip-10-0-129-133 kubenswrapper[2610]: I0421 15:09:12.773169 2610 memory_manager.go:356] "RemoveStaleState removing state" podUID="ba680588-6b22-41b3-8dcc-ab3ead6e8fee" containerName="authorino"
Apr 21 15:09:12.775055 ip-10-0-129-133 kubenswrapper[2610]: I0421 15:09:12.775034 2610 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-675f5f7d99-sbtlg"
Apr 21 15:09:12.777736 ip-10-0-129-133 kubenswrapper[2610]: I0421 15:09:12.777718 2610 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"authorino-server-cert\""
Apr 21 15:09:12.778070 ip-10-0-129-133 kubenswrapper[2610]: I0421 15:09:12.778054 2610 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"openshift-service-ca.crt\""
Apr 21 15:09:12.778531 ip-10-0-129-133 kubenswrapper[2610]: I0421 15:09:12.778513 2610 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"kube-root-ca.crt\""
Apr 21 15:09:12.778729 ip-10-0-129-133 kubenswrapper[2610]: I0421 15:09:12.778585 2610 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"authorino-authorino-dockercfg-wqcsw\""
Apr 21 15:09:12.778839 ip-10-0-129-133 kubenswrapper[2610]: I0421 15:09:12.778822 2610 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"authorino-oidc-ca\""
Apr 21 15:09:12.788208 ip-10-0-129-133 kubenswrapper[2610]: I0421 15:09:12.788188 2610 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-675f5f7d99-sbtlg"]
Apr 21 15:09:12.869229 ip-10-0-129-133 kubenswrapper[2610]: I0421 15:09:12.869193 2610 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xxp5g\" (UniqueName: \"kubernetes.io/projected/08a0c2d4-9e75-4601-8592-fa86229571ae-kube-api-access-xxp5g\") pod \"authorino-675f5f7d99-sbtlg\" (UID: \"08a0c2d4-9e75-4601-8592-fa86229571ae\") " pod="kuadrant-system/authorino-675f5f7d99-sbtlg"
Apr 21 15:09:12.869229 ip-10-0-129-133 kubenswrapper[2610]: I0421 15:09:12.869230 2610 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-cert\" (UniqueName: \"kubernetes.io/secret/08a0c2d4-9e75-4601-8592-fa86229571ae-tls-cert\") pod \"authorino-675f5f7d99-sbtlg\" (UID: \"08a0c2d4-9e75-4601-8592-fa86229571ae\") " pod="kuadrant-system/authorino-675f5f7d99-sbtlg"
Apr 21 15:09:12.869464 ip-10-0-129-133 kubenswrapper[2610]: I0421 15:09:12.869351 2610 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oidc-ca\" (UniqueName: \"kubernetes.io/configmap/08a0c2d4-9e75-4601-8592-fa86229571ae-oidc-ca\") pod \"authorino-675f5f7d99-sbtlg\" (UID: \"08a0c2d4-9e75-4601-8592-fa86229571ae\") " pod="kuadrant-system/authorino-675f5f7d99-sbtlg"
Apr 21 15:09:12.970378 ip-10-0-129-133 kubenswrapper[2610]: I0421 15:09:12.970338 2610 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oidc-ca\" (UniqueName: \"kubernetes.io/configmap/08a0c2d4-9e75-4601-8592-fa86229571ae-oidc-ca\") pod \"authorino-675f5f7d99-sbtlg\" (UID: \"08a0c2d4-9e75-4601-8592-fa86229571ae\") " pod="kuadrant-system/authorino-675f5f7d99-sbtlg"
Apr 21 15:09:12.970569 ip-10-0-129-133 kubenswrapper[2610]: I0421 15:09:12.970400 2610 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xxp5g\" (UniqueName: \"kubernetes.io/projected/08a0c2d4-9e75-4601-8592-fa86229571ae-kube-api-access-xxp5g\") pod \"authorino-675f5f7d99-sbtlg\" (UID: \"08a0c2d4-9e75-4601-8592-fa86229571ae\") " pod="kuadrant-system/authorino-675f5f7d99-sbtlg"
Apr 21 15:09:12.970569 ip-10-0-129-133 kubenswrapper[2610]: I0421 15:09:12.970424 2610 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-cert\" (UniqueName: \"kubernetes.io/secret/08a0c2d4-9e75-4601-8592-fa86229571ae-tls-cert\") pod \"authorino-675f5f7d99-sbtlg\" (UID: \"08a0c2d4-9e75-4601-8592-fa86229571ae\") " pod="kuadrant-system/authorino-675f5f7d99-sbtlg"
Apr 21 15:09:12.971085 ip-10-0-129-133 kubenswrapper[2610]: I0421 15:09:12.971063 2610 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oidc-ca\" (UniqueName: \"kubernetes.io/configmap/08a0c2d4-9e75-4601-8592-fa86229571ae-oidc-ca\") pod \"authorino-675f5f7d99-sbtlg\" (UID: \"08a0c2d4-9e75-4601-8592-fa86229571ae\") " pod="kuadrant-system/authorino-675f5f7d99-sbtlg"
Apr 21 15:09:12.973014 ip-10-0-129-133 kubenswrapper[2610]: I0421 15:09:12.972991 2610 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-cert\" (UniqueName: \"kubernetes.io/secret/08a0c2d4-9e75-4601-8592-fa86229571ae-tls-cert\") pod \"authorino-675f5f7d99-sbtlg\" (UID: \"08a0c2d4-9e75-4601-8592-fa86229571ae\") " pod="kuadrant-system/authorino-675f5f7d99-sbtlg"
Apr 21 15:09:12.983242 ip-10-0-129-133 kubenswrapper[2610]: I0421 15:09:12.983211 2610 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-xxp5g\" (UniqueName: \"kubernetes.io/projected/08a0c2d4-9e75-4601-8592-fa86229571ae-kube-api-access-xxp5g\") pod \"authorino-675f5f7d99-sbtlg\" (UID: \"08a0c2d4-9e75-4601-8592-fa86229571ae\") " pod="kuadrant-system/authorino-675f5f7d99-sbtlg"
Apr 21 15:09:13.084127 ip-10-0-129-133 kubenswrapper[2610]: I0421 15:09:13.084023 2610 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-675f5f7d99-sbtlg"
Apr 21 15:09:13.208510 ip-10-0-129-133 kubenswrapper[2610]: I0421 15:09:13.208478 2610 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-675f5f7d99-sbtlg"]
Apr 21 15:09:13.211912 ip-10-0-129-133 kubenswrapper[2610]: W0421 15:09:13.211885 2610 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod08a0c2d4_9e75_4601_8592_fa86229571ae.slice/crio-4baccce9d9594d0d0c89ae9631839d217811a064bb679ad3009f0864684ed3ed WatchSource:0}: Error finding container 4baccce9d9594d0d0c89ae9631839d217811a064bb679ad3009f0864684ed3ed: Status 404 returned error can't find the container with id 4baccce9d9594d0d0c89ae9631839d217811a064bb679ad3009f0864684ed3ed
Apr 21 15:09:13.213354 ip-10-0-129-133 kubenswrapper[2610]: I0421 15:09:13.213331 2610 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 21 15:09:13.282966 ip-10-0-129-133 kubenswrapper[2610]: I0421 15:09:13.282933 2610 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-675f5f7d99-sbtlg" event={"ID":"08a0c2d4-9e75-4601-8592-fa86229571ae","Type":"ContainerStarted","Data":"4baccce9d9594d0d0c89ae9631839d217811a064bb679ad3009f0864684ed3ed"}
Apr 21 15:09:14.288527 ip-10-0-129-133 kubenswrapper[2610]: I0421 15:09:14.288492 2610 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-675f5f7d99-sbtlg" event={"ID":"08a0c2d4-9e75-4601-8592-fa86229571ae","Type":"ContainerStarted","Data":"19f295590c4bc4f26601ebb5dac5fec1fc9541370d9d628662eee3e5ab787b62"}
Apr 21 15:09:14.309436 ip-10-0-129-133 kubenswrapper[2610]: I0421 15:09:14.309375 2610 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/authorino-675f5f7d99-sbtlg" podStartSLOduration=1.743089348 podStartE2EDuration="2.309361474s" podCreationTimestamp="2026-04-21 15:09:12 +0000 UTC" firstStartedPulling="2026-04-21 15:09:13.21352391 +0000 UTC m=+817.197415942" lastFinishedPulling="2026-04-21 15:09:13.779796041 +0000 UTC m=+817.763688068" observedRunningTime="2026-04-21 15:09:14.309124882 +0000 UTC m=+818.293016930" watchObservedRunningTime="2026-04-21 15:09:14.309361474 +0000 UTC m=+818.293253503"
Apr 21 15:10:36.555693 ip-10-0-129-133 kubenswrapper[2610]: I0421 15:10:36.555608 2610 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-2k25j_d982beb0-1451-48ab-b61a-060b6d23cfc7/console-operator/2.log"
Apr 21 15:10:36.556214 ip-10-0-129-133 kubenswrapper[2610]: I0421 15:10:36.555787 2610 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-2k25j_d982beb0-1451-48ab-b61a-060b6d23cfc7/console-operator/2.log"
Apr 21 15:11:00.920634 ip-10-0-129-133 kubenswrapper[2610]: I0421 15:11:00.920604 2610 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/authorino-5b68f979cd-ww9s9"]
Apr 21 15:11:00.922897 ip-10-0-129-133 kubenswrapper[2610]: I0421 15:11:00.922879 2610 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-5b68f979cd-ww9s9"
Apr 21 15:11:00.942064 ip-10-0-129-133 kubenswrapper[2610]: I0421 15:11:00.942039 2610 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-5b68f979cd-ww9s9"]
Apr 21 15:11:00.985911 ip-10-0-129-133 kubenswrapper[2610]: I0421 15:11:00.985863 2610 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oidc-ca\" (UniqueName: \"kubernetes.io/configmap/41aa9d95-23b7-4da8-a6e6-a6233c264532-oidc-ca\") pod \"authorino-5b68f979cd-ww9s9\" (UID: \"41aa9d95-23b7-4da8-a6e6-a6233c264532\") " pod="kuadrant-system/authorino-5b68f979cd-ww9s9"
Apr 21 15:11:00.986101 ip-10-0-129-133 kubenswrapper[2610]: I0421 15:11:00.985951 2610 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q88jw\" (UniqueName: \"kubernetes.io/projected/41aa9d95-23b7-4da8-a6e6-a6233c264532-kube-api-access-q88jw\") pod \"authorino-5b68f979cd-ww9s9\" (UID: \"41aa9d95-23b7-4da8-a6e6-a6233c264532\") " pod="kuadrant-system/authorino-5b68f979cd-ww9s9"
Apr 21 15:11:00.986101 ip-10-0-129-133 kubenswrapper[2610]: I0421 15:11:00.985981 2610 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-cert\" (UniqueName: \"kubernetes.io/secret/41aa9d95-23b7-4da8-a6e6-a6233c264532-tls-cert\") pod \"authorino-5b68f979cd-ww9s9\" (UID: \"41aa9d95-23b7-4da8-a6e6-a6233c264532\") " pod="kuadrant-system/authorino-5b68f979cd-ww9s9"
Apr 21 15:11:01.087076 ip-10-0-129-133 kubenswrapper[2610]: I0421 15:11:01.087033 2610 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oidc-ca\" (UniqueName: \"kubernetes.io/configmap/41aa9d95-23b7-4da8-a6e6-a6233c264532-oidc-ca\") pod \"authorino-5b68f979cd-ww9s9\" (UID: \"41aa9d95-23b7-4da8-a6e6-a6233c264532\") " pod="kuadrant-system/authorino-5b68f979cd-ww9s9"
Apr 21 15:11:01.087247 ip-10-0-129-133 kubenswrapper[2610]: I0421 15:11:01.087102 2610 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-q88jw\" (UniqueName: \"kubernetes.io/projected/41aa9d95-23b7-4da8-a6e6-a6233c264532-kube-api-access-q88jw\") pod \"authorino-5b68f979cd-ww9s9\" (UID: \"41aa9d95-23b7-4da8-a6e6-a6233c264532\") " pod="kuadrant-system/authorino-5b68f979cd-ww9s9"
Apr 21 15:11:01.087247 ip-10-0-129-133 kubenswrapper[2610]: I0421 15:11:01.087130 2610 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-cert\" (UniqueName: \"kubernetes.io/secret/41aa9d95-23b7-4da8-a6e6-a6233c264532-tls-cert\") pod \"authorino-5b68f979cd-ww9s9\" (UID: \"41aa9d95-23b7-4da8-a6e6-a6233c264532\") " pod="kuadrant-system/authorino-5b68f979cd-ww9s9"
Apr 21 15:11:01.087695 ip-10-0-129-133 kubenswrapper[2610]: I0421 15:11:01.087666 2610 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oidc-ca\" (UniqueName: \"kubernetes.io/configmap/41aa9d95-23b7-4da8-a6e6-a6233c264532-oidc-ca\") pod \"authorino-5b68f979cd-ww9s9\" (UID: \"41aa9d95-23b7-4da8-a6e6-a6233c264532\") " pod="kuadrant-system/authorino-5b68f979cd-ww9s9"
Apr 21 15:11:01.089751 ip-10-0-129-133 kubenswrapper[2610]: I0421 15:11:01.089732 2610 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-cert\" (UniqueName: \"kubernetes.io/secret/41aa9d95-23b7-4da8-a6e6-a6233c264532-tls-cert\") pod \"authorino-5b68f979cd-ww9s9\" (UID: \"41aa9d95-23b7-4da8-a6e6-a6233c264532\") " pod="kuadrant-system/authorino-5b68f979cd-ww9s9"
Apr 21 15:11:01.096398 ip-10-0-129-133 kubenswrapper[2610]: I0421 15:11:01.096372 2610 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-q88jw\" (UniqueName: \"kubernetes.io/projected/41aa9d95-23b7-4da8-a6e6-a6233c264532-kube-api-access-q88jw\") pod \"authorino-5b68f979cd-ww9s9\" (UID: \"41aa9d95-23b7-4da8-a6e6-a6233c264532\") " pod="kuadrant-system/authorino-5b68f979cd-ww9s9"
Apr 21 15:11:01.232610 ip-10-0-129-133 kubenswrapper[2610]: I0421 15:11:01.232480 2610 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-5b68f979cd-ww9s9"
Apr 21 15:11:01.381697 ip-10-0-129-133 kubenswrapper[2610]: I0421 15:11:01.381658 2610 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-5b68f979cd-ww9s9"]
Apr 21 15:11:01.387377 ip-10-0-129-133 kubenswrapper[2610]: W0421 15:11:01.387349 2610 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod41aa9d95_23b7_4da8_a6e6_a6233c264532.slice/crio-66a9000d15e36b665b06469138ec373805a7160d017da5b933a9148a2a26858e WatchSource:0}: Error finding container 66a9000d15e36b665b06469138ec373805a7160d017da5b933a9148a2a26858e: Status 404 returned error can't find the container with id 66a9000d15e36b665b06469138ec373805a7160d017da5b933a9148a2a26858e
Apr 21 15:11:01.649681 ip-10-0-129-133 kubenswrapper[2610]: I0421 15:11:01.649640 2610 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-5b68f979cd-ww9s9" event={"ID":"41aa9d95-23b7-4da8-a6e6-a6233c264532","Type":"ContainerStarted","Data":"66a9000d15e36b665b06469138ec373805a7160d017da5b933a9148a2a26858e"}
Apr 21 15:11:02.655168 ip-10-0-129-133 kubenswrapper[2610]: I0421 15:11:02.655136 2610 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-5b68f979cd-ww9s9" event={"ID":"41aa9d95-23b7-4da8-a6e6-a6233c264532","Type":"ContainerStarted","Data":"f01c5f0a66aee7abb02e245dddc12d854ba8a79d52bb4092baea6c75a2487219"}
Apr 21 15:11:02.716700 ip-10-0-129-133 kubenswrapper[2610]: I0421 15:11:02.716649 2610 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/authorino-5b68f979cd-ww9s9" podStartSLOduration=2.349702642 podStartE2EDuration="2.716634118s" podCreationTimestamp="2026-04-21 15:11:00 +0000 UTC" firstStartedPulling="2026-04-21 15:11:01.389012618 +0000 UTC m=+925.372904643" lastFinishedPulling="2026-04-21 15:11:01.755944092 +0000 UTC m=+925.739836119" observedRunningTime="2026-04-21 15:11:02.716526429 +0000 UTC m=+926.700418478" watchObservedRunningTime="2026-04-21 15:11:02.716634118 +0000 UTC m=+926.700526166"
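
The "Observed pod startup duration" entries above print podStartSLOduration as a bare float of seconds but podStartE2EDuration with an explicit "s" suffix and quotes. A hedged Go sketch that normalises both forms before parsing; the regexp and helper are my own illustration, not a kubelet API:

    package main

    import (
        "fmt"
        "regexp"
        "strings"
        "time"
    )

    // durRe pulls both startup-duration fields out of a
    // pod_startup_latency_tracker line, quoted or not.
    var durRe = regexp.MustCompile(`podStart(SLO|E2E)duration="?([0-9.]+s?)"?`)

    // parseDur appends the missing "s" unit before handing the value to
    // time.ParseDuration; bare floats in these logs are seconds.
    func parseDur(s string) (time.Duration, error) {
        if !strings.HasSuffix(s, "s") {
            s += "s"
        }
        return time.ParseDuration(s)
    }

    func main() {
        line := `"Observed pod startup duration" pod="kuadrant-system/authorino-5b68f979cd-ww9s9" podStartSLOduration=2.349702642 podStartE2EDuration="2.716634118s"`
        for _, m := range durRe.FindAllStringSubmatch(line, -1) {
            if d, err := parseDur(m[2]); err == nil {
                fmt.Printf("podStart%sduration = %v\n", m[1], d)
            }
        }
    }
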
Apr 21 15:11:02.797381 ip-10-0-129-133 kubenswrapper[2610]: I0421 15:11:02.797349 2610 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-675f5f7d99-sbtlg"]
Apr 21 15:11:02.797604 ip-10-0-129-133 kubenswrapper[2610]: I0421 15:11:02.797552 2610 kuberuntime_container.go:864] "Killing container with a grace period" pod="kuadrant-system/authorino-675f5f7d99-sbtlg" podUID="08a0c2d4-9e75-4601-8592-fa86229571ae" containerName="authorino" containerID="cri-o://19f295590c4bc4f26601ebb5dac5fec1fc9541370d9d628662eee3e5ab787b62" gracePeriod=30
Apr 21 15:11:03.036890 ip-10-0-129-133 kubenswrapper[2610]: I0421 15:11:03.036862 2610 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-675f5f7d99-sbtlg"
Apr 21 15:11:03.101629 ip-10-0-129-133 kubenswrapper[2610]: I0421 15:11:03.101565 2610 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xxp5g\" (UniqueName: \"kubernetes.io/projected/08a0c2d4-9e75-4601-8592-fa86229571ae-kube-api-access-xxp5g\") pod \"08a0c2d4-9e75-4601-8592-fa86229571ae\" (UID: \"08a0c2d4-9e75-4601-8592-fa86229571ae\") "
Apr 21 15:11:03.101629 ip-10-0-129-133 kubenswrapper[2610]: I0421 15:11:03.101630 2610 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-cert\" (UniqueName: \"kubernetes.io/secret/08a0c2d4-9e75-4601-8592-fa86229571ae-tls-cert\") pod \"08a0c2d4-9e75-4601-8592-fa86229571ae\" (UID: \"08a0c2d4-9e75-4601-8592-fa86229571ae\") "
Apr 21 15:11:03.101862 ip-10-0-129-133 kubenswrapper[2610]: I0421 15:11:03.101672 2610 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oidc-ca\" (UniqueName: \"kubernetes.io/configmap/08a0c2d4-9e75-4601-8592-fa86229571ae-oidc-ca\") pod \"08a0c2d4-9e75-4601-8592-fa86229571ae\" (UID: \"08a0c2d4-9e75-4601-8592-fa86229571ae\") "
Apr 21 15:11:03.104471 ip-10-0-129-133 kubenswrapper[2610]: I0421 15:11:03.104418 2610 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/08a0c2d4-9e75-4601-8592-fa86229571ae-kube-api-access-xxp5g" (OuterVolumeSpecName: "kube-api-access-xxp5g") pod "08a0c2d4-9e75-4601-8592-fa86229571ae" (UID: "08a0c2d4-9e75-4601-8592-fa86229571ae"). InnerVolumeSpecName "kube-api-access-xxp5g". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 21 15:11:03.107244 ip-10-0-129-133 kubenswrapper[2610]: I0421 15:11:03.107214 2610 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/08a0c2d4-9e75-4601-8592-fa86229571ae-oidc-ca" (OuterVolumeSpecName: "oidc-ca") pod "08a0c2d4-9e75-4601-8592-fa86229571ae" (UID: "08a0c2d4-9e75-4601-8592-fa86229571ae"). InnerVolumeSpecName "oidc-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 21 15:11:03.112711 ip-10-0-129-133 kubenswrapper[2610]: I0421 15:11:03.112684 2610 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/08a0c2d4-9e75-4601-8592-fa86229571ae-tls-cert" (OuterVolumeSpecName: "tls-cert") pod "08a0c2d4-9e75-4601-8592-fa86229571ae" (UID: "08a0c2d4-9e75-4601-8592-fa86229571ae"). InnerVolumeSpecName "tls-cert". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 21 15:11:03.202905 ip-10-0-129-133 kubenswrapper[2610]: I0421 15:11:03.202820 2610 reconciler_common.go:299] "Volume detached for volume \"oidc-ca\" (UniqueName: \"kubernetes.io/configmap/08a0c2d4-9e75-4601-8592-fa86229571ae-oidc-ca\") on node \"ip-10-0-129-133.ec2.internal\" DevicePath \"\""
Apr 21 15:11:03.202905 ip-10-0-129-133 kubenswrapper[2610]: I0421 15:11:03.202851 2610 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-xxp5g\" (UniqueName: \"kubernetes.io/projected/08a0c2d4-9e75-4601-8592-fa86229571ae-kube-api-access-xxp5g\") on node \"ip-10-0-129-133.ec2.internal\" DevicePath \"\""
Apr 21 15:11:03.202905 ip-10-0-129-133 kubenswrapper[2610]: I0421 15:11:03.202862 2610 reconciler_common.go:299] "Volume detached for volume \"tls-cert\" (UniqueName: \"kubernetes.io/secret/08a0c2d4-9e75-4601-8592-fa86229571ae-tls-cert\") on node \"ip-10-0-129-133.ec2.internal\" DevicePath \"\""
Apr 21 15:11:03.660510 ip-10-0-129-133 kubenswrapper[2610]: I0421 15:11:03.660475 2610 generic.go:358] "Generic (PLEG): container finished" podID="08a0c2d4-9e75-4601-8592-fa86229571ae" containerID="19f295590c4bc4f26601ebb5dac5fec1fc9541370d9d628662eee3e5ab787b62" exitCode=0
Apr 21 15:11:03.660990 ip-10-0-129-133 kubenswrapper[2610]: I0421 15:11:03.660530 2610 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-675f5f7d99-sbtlg"
Apr 21 15:11:03.660990 ip-10-0-129-133 kubenswrapper[2610]: I0421 15:11:03.660563 2610 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-675f5f7d99-sbtlg" event={"ID":"08a0c2d4-9e75-4601-8592-fa86229571ae","Type":"ContainerDied","Data":"19f295590c4bc4f26601ebb5dac5fec1fc9541370d9d628662eee3e5ab787b62"}
Apr 21 15:11:03.660990 ip-10-0-129-133 kubenswrapper[2610]: I0421 15:11:03.660610 2610 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-675f5f7d99-sbtlg" event={"ID":"08a0c2d4-9e75-4601-8592-fa86229571ae","Type":"ContainerDied","Data":"4baccce9d9594d0d0c89ae9631839d217811a064bb679ad3009f0864684ed3ed"}
Apr 21 15:11:03.660990 ip-10-0-129-133 kubenswrapper[2610]: I0421 15:11:03.660628 2610 scope.go:117] "RemoveContainer" containerID="19f295590c4bc4f26601ebb5dac5fec1fc9541370d9d628662eee3e5ab787b62"
Apr 21 15:11:03.669203 ip-10-0-129-133 kubenswrapper[2610]: I0421 15:11:03.669185 2610 scope.go:117] "RemoveContainer" containerID="19f295590c4bc4f26601ebb5dac5fec1fc9541370d9d628662eee3e5ab787b62"
Apr 21 15:11:03.669450 ip-10-0-129-133 kubenswrapper[2610]: E0421 15:11:03.669434 2610 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"19f295590c4bc4f26601ebb5dac5fec1fc9541370d9d628662eee3e5ab787b62\": container with ID starting with 19f295590c4bc4f26601ebb5dac5fec1fc9541370d9d628662eee3e5ab787b62 not found: ID does not exist" containerID="19f295590c4bc4f26601ebb5dac5fec1fc9541370d9d628662eee3e5ab787b62"
Apr 21 15:11:03.669514 ip-10-0-129-133 kubenswrapper[2610]: I0421 15:11:03.669496 2610 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"19f295590c4bc4f26601ebb5dac5fec1fc9541370d9d628662eee3e5ab787b62"} err="failed to get container status \"19f295590c4bc4f26601ebb5dac5fec1fc9541370d9d628662eee3e5ab787b62\": rpc error: code = NotFound desc = could not find container \"19f295590c4bc4f26601ebb5dac5fec1fc9541370d9d628662eee3e5ab787b62\": container with ID starting with 19f295590c4bc4f26601ebb5dac5fec1fc9541370d9d628662eee3e5ab787b62 not found: ID does not exist"
Apr 21 15:11:03.716088 ip-10-0-129-133 kubenswrapper[2610]: I0421 15:11:03.716057 2610 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-675f5f7d99-sbtlg"]
Apr 21 15:11:03.729478 ip-10-0-129-133 kubenswrapper[2610]: I0421 15:11:03.729446 2610 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/authorino-675f5f7d99-sbtlg"]
Apr 21 15:11:04.626256 ip-10-0-129-133 kubenswrapper[2610]: I0421 15:11:04.626209 2610 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="08a0c2d4-9e75-4601-8592-fa86229571ae" path="/var/lib/kubelet/pods/08a0c2d4-9e75-4601-8592-fa86229571ae/volumes"
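
The RemoveContainer / "ContainerStatus from runtime service failed" pair above is a benign race: the container was already gone when its status was re-queried, so the runtime answered with gRPC NotFound and the kubelet simply logged the error. A sketch of how a caller might classify such an error with the standard gRPC status package; removeContainer here is a hypothetical stand-in for the real CRI round trip, not the kubelet's own code:

    package main

    import (
        "fmt"

        "google.golang.org/grpc/codes"
        "google.golang.org/grpc/status"
    )

    // removeContainer stands in for the CRI ContainerStatus/RemoveContainer
    // round trip; here it always fails the way the runtime did above.
    func removeContainer(id string) error {
        return status.Errorf(codes.NotFound, "could not find container %q", id)
    }

    func main() {
        err := removeContainer("19f295590c4bc4f26601ebb5dac5fec1fc9541370d9d628662eee3e5ab787b62")
        // NotFound means the container is already gone, which is the desired
        // end state of a delete; anything else is a real failure.
        if s, ok := status.FromError(err); ok && s.Code() == codes.NotFound {
            fmt.Println("container already gone; nothing left to delete")
            return
        }
        if err != nil {
            fmt.Println("unexpected runtime failure:", err)
        }
    }
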
path="/var/log/pods/kuadrant-system_authorino-5b68f979cd-ww9s9_41aa9d95-23b7-4da8-a6e6-a6233c264532/authorino/0.log" Apr 21 15:38:18.544957 ip-10-0-129-133 kubenswrapper[2610]: I0421 15:38:18.544927 2610 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_opendatahub-operator-controller-manager-6cfc874c8f-jck6m_4b20329a-2270-4e86-9339-3f99d193e016/manager/0.log" Apr 21 15:38:18.884251 ip-10-0-129-133 kubenswrapper[2610]: I0421 15:38:18.884175 2610 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_postgres-868db5846d-wh579_3f6c0ccf-bc8a-467c-b976-97198d25282c/postgres/0.log" Apr 21 15:38:20.122852 ip-10-0-129-133 kubenswrapper[2610]: I0421 15:38:20.122823 2610 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_authorino-5b68f979cd-ww9s9_41aa9d95-23b7-4da8-a6e6-a6233c264532/authorino/0.log" Apr 21 15:38:21.475059 ip-10-0-129-133 kubenswrapper[2610]: I0421 15:38:21.475028 2610 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_kube-auth-proxy-5bcc894b57-6xj5p_0f85023e-bf45-4798-9875-d1af959fe30c/kube-auth-proxy/0.log" Apr 21 15:38:21.821665 ip-10-0-129-133 kubenswrapper[2610]: I0421 15:38:21.821636 2610 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_router-default-76d7d6f776-nvlj4_e22e5723-18d9-4194-867b-028f5e78e14d/router/0.log" Apr 21 15:38:26.896657 ip-10-0-129-133 kubenswrapper[2610]: I0421 15:38:26.896620 2610 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-b6hvv/must-gather-d4nnn"] Apr 21 15:38:26.897016 ip-10-0-129-133 kubenswrapper[2610]: I0421 15:38:26.896974 2610 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="08a0c2d4-9e75-4601-8592-fa86229571ae" containerName="authorino" Apr 21 15:38:26.897016 ip-10-0-129-133 kubenswrapper[2610]: I0421 15:38:26.896987 2610 state_mem.go:107] "Deleted CPUSet assignment" podUID="08a0c2d4-9e75-4601-8592-fa86229571ae" containerName="authorino" Apr 21 15:38:26.897089 ip-10-0-129-133 kubenswrapper[2610]: I0421 15:38:26.897064 2610 memory_manager.go:356] "RemoveStaleState removing state" podUID="08a0c2d4-9e75-4601-8592-fa86229571ae" containerName="authorino" Apr 21 15:38:26.900180 ip-10-0-129-133 kubenswrapper[2610]: I0421 15:38:26.900163 2610 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-b6hvv/must-gather-d4nnn" Apr 21 15:38:26.902226 ip-10-0-129-133 kubenswrapper[2610]: I0421 15:38:26.902201 2610 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-b6hvv\"/\"default-dockercfg-kmhbt\"" Apr 21 15:38:26.902343 ip-10-0-129-133 kubenswrapper[2610]: I0421 15:38:26.902201 2610 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-b6hvv\"/\"kube-root-ca.crt\"" Apr 21 15:38:26.902343 ip-10-0-129-133 kubenswrapper[2610]: I0421 15:38:26.902296 2610 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-b6hvv\"/\"openshift-service-ca.crt\"" Apr 21 15:38:26.917215 ip-10-0-129-133 kubenswrapper[2610]: I0421 15:38:26.917190 2610 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-b6hvv/must-gather-d4nnn"] Apr 21 15:38:26.954731 ip-10-0-129-133 kubenswrapper[2610]: I0421 15:38:26.954699 2610 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qvsb6\" (UniqueName: \"kubernetes.io/projected/1d2a27f7-c9c3-4173-aa4d-da82e7ba765a-kube-api-access-qvsb6\") pod \"must-gather-d4nnn\" (UID: \"1d2a27f7-c9c3-4173-aa4d-da82e7ba765a\") " pod="openshift-must-gather-b6hvv/must-gather-d4nnn" Apr 21 15:38:26.954846 ip-10-0-129-133 kubenswrapper[2610]: I0421 15:38:26.954750 2610 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/1d2a27f7-c9c3-4173-aa4d-da82e7ba765a-must-gather-output\") pod \"must-gather-d4nnn\" (UID: \"1d2a27f7-c9c3-4173-aa4d-da82e7ba765a\") " pod="openshift-must-gather-b6hvv/must-gather-d4nnn" Apr 21 15:38:27.055362 ip-10-0-129-133 kubenswrapper[2610]: I0421 15:38:27.055318 2610 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qvsb6\" (UniqueName: \"kubernetes.io/projected/1d2a27f7-c9c3-4173-aa4d-da82e7ba765a-kube-api-access-qvsb6\") pod \"must-gather-d4nnn\" (UID: \"1d2a27f7-c9c3-4173-aa4d-da82e7ba765a\") " pod="openshift-must-gather-b6hvv/must-gather-d4nnn" Apr 21 15:38:27.055536 ip-10-0-129-133 kubenswrapper[2610]: I0421 15:38:27.055392 2610 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/1d2a27f7-c9c3-4173-aa4d-da82e7ba765a-must-gather-output\") pod \"must-gather-d4nnn\" (UID: \"1d2a27f7-c9c3-4173-aa4d-da82e7ba765a\") " pod="openshift-must-gather-b6hvv/must-gather-d4nnn" Apr 21 15:38:27.055809 ip-10-0-129-133 kubenswrapper[2610]: I0421 15:38:27.055790 2610 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/1d2a27f7-c9c3-4173-aa4d-da82e7ba765a-must-gather-output\") pod \"must-gather-d4nnn\" (UID: \"1d2a27f7-c9c3-4173-aa4d-da82e7ba765a\") " pod="openshift-must-gather-b6hvv/must-gather-d4nnn" Apr 21 15:38:27.063450 ip-10-0-129-133 kubenswrapper[2610]: I0421 15:38:27.063417 2610 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qvsb6\" (UniqueName: \"kubernetes.io/projected/1d2a27f7-c9c3-4173-aa4d-da82e7ba765a-kube-api-access-qvsb6\") pod \"must-gather-d4nnn\" (UID: \"1d2a27f7-c9c3-4173-aa4d-da82e7ba765a\") " pod="openshift-must-gather-b6hvv/must-gather-d4nnn" Apr 21 15:38:27.209672 ip-10-0-129-133 kubenswrapper[2610]: I0421 15:38:27.209562 2610 util.go:30] "No sandbox 
for pod can be found. Need to start a new one" pod="openshift-must-gather-b6hvv/must-gather-d4nnn" Apr 21 15:38:27.335141 ip-10-0-129-133 kubenswrapper[2610]: I0421 15:38:27.335107 2610 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-b6hvv/must-gather-d4nnn"] Apr 21 15:38:27.338544 ip-10-0-129-133 kubenswrapper[2610]: W0421 15:38:27.338515 2610 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1d2a27f7_c9c3_4173_aa4d_da82e7ba765a.slice/crio-c420aca0a50c20d050e72de5e32d0ab9e7aed89aaff741967647c750d4f20b80 WatchSource:0}: Error finding container c420aca0a50c20d050e72de5e32d0ab9e7aed89aaff741967647c750d4f20b80: Status 404 returned error can't find the container with id c420aca0a50c20d050e72de5e32d0ab9e7aed89aaff741967647c750d4f20b80 Apr 21 15:38:27.340262 ip-10-0-129-133 kubenswrapper[2610]: I0421 15:38:27.340246 2610 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 21 15:38:28.089278 ip-10-0-129-133 kubenswrapper[2610]: I0421 15:38:28.089243 2610 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-b6hvv/must-gather-d4nnn" event={"ID":"1d2a27f7-c9c3-4173-aa4d-da82e7ba765a","Type":"ContainerStarted","Data":"c420aca0a50c20d050e72de5e32d0ab9e7aed89aaff741967647c750d4f20b80"} Apr 21 15:38:29.095478 ip-10-0-129-133 kubenswrapper[2610]: I0421 15:38:29.095424 2610 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-b6hvv/must-gather-d4nnn" event={"ID":"1d2a27f7-c9c3-4173-aa4d-da82e7ba765a","Type":"ContainerStarted","Data":"be0ec01d8f0eb69b34b4aa4db263c3ab111d7b80fbb5e320c6bcc0b06ac28d4b"} Apr 21 15:38:29.095478 ip-10-0-129-133 kubenswrapper[2610]: I0421 15:38:29.095479 2610 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-b6hvv/must-gather-d4nnn" event={"ID":"1d2a27f7-c9c3-4173-aa4d-da82e7ba765a","Type":"ContainerStarted","Data":"f53f30d8bddaa509077c51241011382c40a155885f3f4be02a5105fc2b305d7f"} Apr 21 15:38:29.111383 ip-10-0-129-133 kubenswrapper[2610]: I0421 15:38:29.111333 2610 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-b6hvv/must-gather-d4nnn" podStartSLOduration=2.367474132 podStartE2EDuration="3.111317951s" podCreationTimestamp="2026-04-21 15:38:26 +0000 UTC" firstStartedPulling="2026-04-21 15:38:27.340372932 +0000 UTC m=+2571.324264958" lastFinishedPulling="2026-04-21 15:38:28.084216737 +0000 UTC m=+2572.068108777" observedRunningTime="2026-04-21 15:38:29.109470297 +0000 UTC m=+2573.093362346" watchObservedRunningTime="2026-04-21 15:38:29.111317951 +0000 UTC m=+2573.095210000" Apr 21 15:38:29.791264 ip-10-0-129-133 kubenswrapper[2610]: I0421 15:38:29.791229 2610 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_global-pull-secret-syncer-h5rsz_1649b770-32f3-4c98-9e33-13d820fcd898/global-pull-secret-syncer/0.log" Apr 21 15:38:29.915535 ip-10-0-129-133 kubenswrapper[2610]: I0421 15:38:29.915502 2610 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_konnectivity-agent-qgsvb_002f040e-530f-43cc-92d7-0789dd3ec88e/konnectivity-agent/0.log" Apr 21 15:38:29.968756 ip-10-0-129-133 kubenswrapper[2610]: I0421 15:38:29.968713 2610 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_kube-apiserver-proxy-ip-10-0-129-133.ec2.internal_4074072fa96faab5923784feb5b91477/haproxy/0.log" Apr 21 15:38:34.671598 ip-10-0-129-133 kubenswrapper[2610]: I0421 15:38:34.671031 2610 
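
The payload printed after event= in the PLEG entries above is valid JSON, so container lifecycle events can be decoded with a small struct. A sketch under that assumption; the struct below is my own reading of the log format, not a published Kubernetes API:

    package main

    import (
        "encoding/json"
        "fmt"
    )

    // plegEvent mirrors the fields visible in the event= payloads above.
    type plegEvent struct {
        ID   string `json:"ID"`   // pod UID
        Type string `json:"Type"` // e.g. ContainerStarted, ContainerDied
        Data string `json:"Data"` // container or sandbox ID
    }

    func main() {
        payload := `{"ID":"1d2a27f7-c9c3-4173-aa4d-da82e7ba765a","Type":"ContainerStarted","Data":"c420aca0a50c20d050e72de5e32d0ab9e7aed89aaff741967647c750d4f20b80"}`
        var ev plegEvent
        if err := json.Unmarshal([]byte(payload), &ev); err != nil {
            panic(err)
        }
        fmt.Printf("pod %s: %s %s\n", ev.ID, ev.Type, ev.Data)
    }
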
log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_authorino-5b68f979cd-ww9s9_41aa9d95-23b7-4da8-a6e6-a6233c264532/authorino/0.log" Apr 21 15:38:36.607962 ip-10-0-129-133 kubenswrapper[2610]: I0421 15:38:36.607926 2610 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_cluster-monitoring-operator-75587bd455-s2zz4_b4238647-950c-4b0e-ac27-6e6a0040c6dc/cluster-monitoring-operator/0.log" Apr 21 15:38:36.712690 ip-10-0-129-133 kubenswrapper[2610]: I0421 15:38:36.712647 2610 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_metrics-server-59d6dcb677-9wdsz_10c7a117-9bb1-4269-bcc2-28891c48c4e1/metrics-server/0.log" Apr 21 15:38:36.747338 ip-10-0-129-133 kubenswrapper[2610]: I0421 15:38:36.747307 2610 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_monitoring-plugin-7dccd58f55-m7zgz_435ebbea-4b21-4fa3-883b-ef3b3e18723e/monitoring-plugin/0.log" Apr 21 15:38:36.856726 ip-10-0-129-133 kubenswrapper[2610]: I0421 15:38:36.856697 2610 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-zcpxh_6e950fdc-c8a9-4a4e-ac1e-c78d8747299f/node-exporter/0.log" Apr 21 15:38:36.877226 ip-10-0-129-133 kubenswrapper[2610]: I0421 15:38:36.877145 2610 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-zcpxh_6e950fdc-c8a9-4a4e-ac1e-c78d8747299f/kube-rbac-proxy/0.log" Apr 21 15:38:36.900730 ip-10-0-129-133 kubenswrapper[2610]: I0421 15:38:36.900702 2610 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-zcpxh_6e950fdc-c8a9-4a4e-ac1e-c78d8747299f/init-textfile/0.log" Apr 21 15:38:37.011840 ip-10-0-129-133 kubenswrapper[2610]: I0421 15:38:37.011807 2610 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_openshift-state-metrics-9d44df66c-lmv67_5b20462c-711e-48e3-bfc6-772375505353/kube-rbac-proxy-main/0.log" Apr 21 15:38:37.034080 ip-10-0-129-133 kubenswrapper[2610]: I0421 15:38:37.034051 2610 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_openshift-state-metrics-9d44df66c-lmv67_5b20462c-711e-48e3-bfc6-772375505353/kube-rbac-proxy-self/0.log" Apr 21 15:38:37.058080 ip-10-0-129-133 kubenswrapper[2610]: I0421 15:38:37.058045 2610 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_openshift-state-metrics-9d44df66c-lmv67_5b20462c-711e-48e3-bfc6-772375505353/openshift-state-metrics/0.log" Apr 21 15:38:37.099365 ip-10-0-129-133 kubenswrapper[2610]: I0421 15:38:37.099302 2610 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_2952e283-ff73-4bc3-9b8f-2ae4a4b32ee5/prometheus/0.log" Apr 21 15:38:37.131533 ip-10-0-129-133 kubenswrapper[2610]: I0421 15:38:37.131445 2610 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_2952e283-ff73-4bc3-9b8f-2ae4a4b32ee5/config-reloader/0.log" Apr 21 15:38:37.153149 ip-10-0-129-133 kubenswrapper[2610]: I0421 15:38:37.153124 2610 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_2952e283-ff73-4bc3-9b8f-2ae4a4b32ee5/thanos-sidecar/0.log" Apr 21 15:38:37.174612 ip-10-0-129-133 kubenswrapper[2610]: I0421 15:38:37.174559 2610 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_2952e283-ff73-4bc3-9b8f-2ae4a4b32ee5/kube-rbac-proxy-web/0.log" Apr 21 15:38:37.195482 ip-10-0-129-133 kubenswrapper[2610]: 
I0421 15:38:37.195421 2610 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_2952e283-ff73-4bc3-9b8f-2ae4a4b32ee5/kube-rbac-proxy/0.log" Apr 21 15:38:37.215729 ip-10-0-129-133 kubenswrapper[2610]: I0421 15:38:37.215703 2610 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_2952e283-ff73-4bc3-9b8f-2ae4a4b32ee5/kube-rbac-proxy-thanos/0.log" Apr 21 15:38:37.235327 ip-10-0-129-133 kubenswrapper[2610]: I0421 15:38:37.235278 2610 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_2952e283-ff73-4bc3-9b8f-2ae4a4b32ee5/init-config-reloader/0.log" Apr 21 15:38:38.343320 ip-10-0-129-133 kubenswrapper[2610]: I0421 15:38:38.343292 2610 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-b6hvv/perf-node-gather-daemonset-9vqgf"] Apr 21 15:38:38.347654 ip-10-0-129-133 kubenswrapper[2610]: I0421 15:38:38.347630 2610 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-b6hvv/perf-node-gather-daemonset-9vqgf" Apr 21 15:38:38.360955 ip-10-0-129-133 kubenswrapper[2610]: I0421 15:38:38.360925 2610 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-b6hvv/perf-node-gather-daemonset-9vqgf"] Apr 21 15:38:38.459134 ip-10-0-129-133 kubenswrapper[2610]: I0421 15:38:38.459100 2610 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/3ec5c19f-37f9-4d06-996b-b1d3f2a27ddd-lib-modules\") pod \"perf-node-gather-daemonset-9vqgf\" (UID: \"3ec5c19f-37f9-4d06-996b-b1d3f2a27ddd\") " pod="openshift-must-gather-b6hvv/perf-node-gather-daemonset-9vqgf" Apr 21 15:38:38.459322 ip-10-0-129-133 kubenswrapper[2610]: I0421 15:38:38.459158 2610 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6ldzh\" (UniqueName: \"kubernetes.io/projected/3ec5c19f-37f9-4d06-996b-b1d3f2a27ddd-kube-api-access-6ldzh\") pod \"perf-node-gather-daemonset-9vqgf\" (UID: \"3ec5c19f-37f9-4d06-996b-b1d3f2a27ddd\") " pod="openshift-must-gather-b6hvv/perf-node-gather-daemonset-9vqgf" Apr 21 15:38:38.459322 ip-10-0-129-133 kubenswrapper[2610]: I0421 15:38:38.459227 2610 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/3ec5c19f-37f9-4d06-996b-b1d3f2a27ddd-proc\") pod \"perf-node-gather-daemonset-9vqgf\" (UID: \"3ec5c19f-37f9-4d06-996b-b1d3f2a27ddd\") " pod="openshift-must-gather-b6hvv/perf-node-gather-daemonset-9vqgf" Apr 21 15:38:38.459322 ip-10-0-129-133 kubenswrapper[2610]: I0421 15:38:38.459268 2610 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/3ec5c19f-37f9-4d06-996b-b1d3f2a27ddd-podres\") pod \"perf-node-gather-daemonset-9vqgf\" (UID: \"3ec5c19f-37f9-4d06-996b-b1d3f2a27ddd\") " pod="openshift-must-gather-b6hvv/perf-node-gather-daemonset-9vqgf" Apr 21 15:38:38.459475 ip-10-0-129-133 kubenswrapper[2610]: I0421 15:38:38.459392 2610 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/3ec5c19f-37f9-4d06-996b-b1d3f2a27ddd-sys\") pod \"perf-node-gather-daemonset-9vqgf\" (UID: \"3ec5c19f-37f9-4d06-996b-b1d3f2a27ddd\") " pod="openshift-must-gather-b6hvv/perf-node-gather-daemonset-9vqgf" Apr 21 15:38:38.559907 
ip-10-0-129-133 kubenswrapper[2610]: I0421 15:38:38.559878 2610 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/3ec5c19f-37f9-4d06-996b-b1d3f2a27ddd-proc\") pod \"perf-node-gather-daemonset-9vqgf\" (UID: \"3ec5c19f-37f9-4d06-996b-b1d3f2a27ddd\") " pod="openshift-must-gather-b6hvv/perf-node-gather-daemonset-9vqgf" Apr 21 15:38:38.559907 ip-10-0-129-133 kubenswrapper[2610]: I0421 15:38:38.559916 2610 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/3ec5c19f-37f9-4d06-996b-b1d3f2a27ddd-podres\") pod \"perf-node-gather-daemonset-9vqgf\" (UID: \"3ec5c19f-37f9-4d06-996b-b1d3f2a27ddd\") " pod="openshift-must-gather-b6hvv/perf-node-gather-daemonset-9vqgf" Apr 21 15:38:38.560146 ip-10-0-129-133 kubenswrapper[2610]: I0421 15:38:38.559970 2610 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/3ec5c19f-37f9-4d06-996b-b1d3f2a27ddd-sys\") pod \"perf-node-gather-daemonset-9vqgf\" (UID: \"3ec5c19f-37f9-4d06-996b-b1d3f2a27ddd\") " pod="openshift-must-gather-b6hvv/perf-node-gather-daemonset-9vqgf" Apr 21 15:38:38.560146 ip-10-0-129-133 kubenswrapper[2610]: I0421 15:38:38.560007 2610 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/3ec5c19f-37f9-4d06-996b-b1d3f2a27ddd-proc\") pod \"perf-node-gather-daemonset-9vqgf\" (UID: \"3ec5c19f-37f9-4d06-996b-b1d3f2a27ddd\") " pod="openshift-must-gather-b6hvv/perf-node-gather-daemonset-9vqgf" Apr 21 15:38:38.560146 ip-10-0-129-133 kubenswrapper[2610]: I0421 15:38:38.560045 2610 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/3ec5c19f-37f9-4d06-996b-b1d3f2a27ddd-lib-modules\") pod \"perf-node-gather-daemonset-9vqgf\" (UID: \"3ec5c19f-37f9-4d06-996b-b1d3f2a27ddd\") " pod="openshift-must-gather-b6hvv/perf-node-gather-daemonset-9vqgf" Apr 21 15:38:38.560146 ip-10-0-129-133 kubenswrapper[2610]: I0421 15:38:38.560074 2610 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/3ec5c19f-37f9-4d06-996b-b1d3f2a27ddd-sys\") pod \"perf-node-gather-daemonset-9vqgf\" (UID: \"3ec5c19f-37f9-4d06-996b-b1d3f2a27ddd\") " pod="openshift-must-gather-b6hvv/perf-node-gather-daemonset-9vqgf" Apr 21 15:38:38.560146 ip-10-0-129-133 kubenswrapper[2610]: I0421 15:38:38.560091 2610 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6ldzh\" (UniqueName: \"kubernetes.io/projected/3ec5c19f-37f9-4d06-996b-b1d3f2a27ddd-kube-api-access-6ldzh\") pod \"perf-node-gather-daemonset-9vqgf\" (UID: \"3ec5c19f-37f9-4d06-996b-b1d3f2a27ddd\") " pod="openshift-must-gather-b6hvv/perf-node-gather-daemonset-9vqgf" Apr 21 15:38:38.560146 ip-10-0-129-133 kubenswrapper[2610]: I0421 15:38:38.560104 2610 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/3ec5c19f-37f9-4d06-996b-b1d3f2a27ddd-podres\") pod \"perf-node-gather-daemonset-9vqgf\" (UID: \"3ec5c19f-37f9-4d06-996b-b1d3f2a27ddd\") " pod="openshift-must-gather-b6hvv/perf-node-gather-daemonset-9vqgf" Apr 21 15:38:38.560427 ip-10-0-129-133 kubenswrapper[2610]: I0421 15:38:38.560152 2610 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: 
\"kubernetes.io/host-path/3ec5c19f-37f9-4d06-996b-b1d3f2a27ddd-lib-modules\") pod \"perf-node-gather-daemonset-9vqgf\" (UID: \"3ec5c19f-37f9-4d06-996b-b1d3f2a27ddd\") " pod="openshift-must-gather-b6hvv/perf-node-gather-daemonset-9vqgf" Apr 21 15:38:38.568634 ip-10-0-129-133 kubenswrapper[2610]: I0421 15:38:38.568602 2610 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6ldzh\" (UniqueName: \"kubernetes.io/projected/3ec5c19f-37f9-4d06-996b-b1d3f2a27ddd-kube-api-access-6ldzh\") pod \"perf-node-gather-daemonset-9vqgf\" (UID: \"3ec5c19f-37f9-4d06-996b-b1d3f2a27ddd\") " pod="openshift-must-gather-b6hvv/perf-node-gather-daemonset-9vqgf" Apr 21 15:38:38.575591 ip-10-0-129-133 kubenswrapper[2610]: I0421 15:38:38.575557 2610 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-console_networking-console-plugin-cb95c66f6-c5gp2_8033b471-ca39-425f-9cbb-cf56b370a5a2/networking-console-plugin/0.log" Apr 21 15:38:38.661296 ip-10-0-129-133 kubenswrapper[2610]: I0421 15:38:38.661205 2610 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-b6hvv/perf-node-gather-daemonset-9vqgf" Apr 21 15:38:38.812840 ip-10-0-129-133 kubenswrapper[2610]: I0421 15:38:38.812714 2610 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-b6hvv/perf-node-gather-daemonset-9vqgf"] Apr 21 15:38:38.815611 ip-10-0-129-133 kubenswrapper[2610]: W0421 15:38:38.815557 2610 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod3ec5c19f_37f9_4d06_996b_b1d3f2a27ddd.slice/crio-0ff2317fc6f6a50cf1483f534399045a089e3b8618942e8e025a4645148db518 WatchSource:0}: Error finding container 0ff2317fc6f6a50cf1483f534399045a089e3b8618942e8e025a4645148db518: Status 404 returned error can't find the container with id 0ff2317fc6f6a50cf1483f534399045a089e3b8618942e8e025a4645148db518 Apr 21 15:38:39.140474 ip-10-0-129-133 kubenswrapper[2610]: I0421 15:38:39.140437 2610 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-b6hvv/perf-node-gather-daemonset-9vqgf" event={"ID":"3ec5c19f-37f9-4d06-996b-b1d3f2a27ddd","Type":"ContainerStarted","Data":"5acbdd258ef42f58204fd354a181542a5b838dff1f553ef6ac27992ccb355591"} Apr 21 15:38:39.140474 ip-10-0-129-133 kubenswrapper[2610]: I0421 15:38:39.140474 2610 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-b6hvv/perf-node-gather-daemonset-9vqgf" event={"ID":"3ec5c19f-37f9-4d06-996b-b1d3f2a27ddd","Type":"ContainerStarted","Data":"0ff2317fc6f6a50cf1483f534399045a089e3b8618942e8e025a4645148db518"} Apr 21 15:38:39.140742 ip-10-0-129-133 kubenswrapper[2610]: I0421 15:38:39.140612 2610 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-must-gather-b6hvv/perf-node-gather-daemonset-9vqgf" Apr 21 15:38:39.159395 ip-10-0-129-133 kubenswrapper[2610]: I0421 15:38:39.159342 2610 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-b6hvv/perf-node-gather-daemonset-9vqgf" podStartSLOduration=1.1593262850000001 podStartE2EDuration="1.159326285s" podCreationTimestamp="2026-04-21 15:38:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-21 15:38:39.157798507 +0000 UTC m=+2583.141690557" watchObservedRunningTime="2026-04-21 15:38:39.159326285 +0000 UTC m=+2583.143218351" Apr 21 15:38:39.170863 ip-10-0-129-133 kubenswrapper[2610]: I0421 15:38:39.170823 
2610 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-2k25j_d982beb0-1451-48ab-b61a-060b6d23cfc7/console-operator/2.log" Apr 21 15:38:39.176113 ip-10-0-129-133 kubenswrapper[2610]: I0421 15:38:39.176076 2610 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-2k25j_d982beb0-1451-48ab-b61a-060b6d23cfc7/console-operator/3.log" Apr 21 15:38:40.157044 ip-10-0-129-133 kubenswrapper[2610]: I0421 15:38:40.156994 2610 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-storage-operator_volume-data-source-validator-7c6cbb6c87-v7qpf_0de3ebe8-149d-4997-8a88-c28ce1dbe39d/volume-data-source-validator/0.log" Apr 21 15:38:40.896530 ip-10-0-129-133 kubenswrapper[2610]: I0421 15:38:40.896505 2610 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-mhp6p_17ba6101-b1f6-412d-b361-2276f610226b/dns/0.log" Apr 21 15:38:40.918453 ip-10-0-129-133 kubenswrapper[2610]: I0421 15:38:40.918428 2610 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-mhp6p_17ba6101-b1f6-412d-b361-2276f610226b/kube-rbac-proxy/0.log" Apr 21 15:38:41.090483 ip-10-0-129-133 kubenswrapper[2610]: I0421 15:38:41.090453 2610 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-n8qc9_b48f3832-4ecd-46ba-bde8-35a4180bf3ca/dns-node-resolver/0.log" Apr 21 15:38:41.670096 ip-10-0-129-133 kubenswrapper[2610]: I0421 15:38:41.670065 2610 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-z6hwp_72237d81-3f9e-4b04-a299-0acb0dd6604c/node-ca/0.log" Apr 21 15:38:42.622115 ip-10-0-129-133 kubenswrapper[2610]: I0421 15:38:42.622080 2610 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_kube-auth-proxy-5bcc894b57-6xj5p_0f85023e-bf45-4798-9875-d1af959fe30c/kube-auth-proxy/0.log" Apr 21 15:38:42.741432 ip-10-0-129-133 kubenswrapper[2610]: I0421 15:38:42.741401 2610 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_router-default-76d7d6f776-nvlj4_e22e5723-18d9-4194-867b-028f5e78e14d/router/0.log" Apr 21 15:38:43.324505 ip-10-0-129-133 kubenswrapper[2610]: I0421 15:38:43.324475 2610 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-6vmnw_40ebea87-6126-42fa-bcf7-027f7fbce419/serve-healthcheck-canary/0.log" Apr 21 15:38:43.892845 ip-10-0-129-133 kubenswrapper[2610]: I0421 15:38:43.892799 2610 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-operator-585dfdc468-fnrh5_e3d66dda-7e69-48a9-a23b-ca9cdad31f2b/insights-operator/0.log" Apr 21 15:38:43.893691 ip-10-0-129-133 kubenswrapper[2610]: I0421 15:38:43.893661 2610 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-operator-585dfdc468-fnrh5_e3d66dda-7e69-48a9-a23b-ca9cdad31f2b/insights-operator/1.log" Apr 21 15:38:44.139309 ip-10-0-129-133 kubenswrapper[2610]: I0421 15:38:44.139285 2610 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-bbvdg_bb217948-19df-46bd-9ef3-5c07750c4e03/kube-rbac-proxy/0.log" Apr 21 15:38:44.202835 ip-10-0-129-133 kubenswrapper[2610]: I0421 15:38:44.202755 2610 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-bbvdg_bb217948-19df-46bd-9ef3-5c07750c4e03/exporter/0.log" Apr 21 15:38:44.256300 ip-10-0-129-133 kubenswrapper[2610]: 
I0421 15:38:44.256274 2610 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-bbvdg_bb217948-19df-46bd-9ef3-5c07750c4e03/extractor/0.log" Apr 21 15:38:45.157011 ip-10-0-129-133 kubenswrapper[2610]: I0421 15:38:45.156981 2610 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-must-gather-b6hvv/perf-node-gather-daemonset-9vqgf" Apr 21 15:38:47.183462 ip-10-0-129-133 kubenswrapper[2610]: I0421 15:38:47.183431 2610 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_opendatahub-operator-controller-manager-6cfc874c8f-jck6m_4b20329a-2270-4e86-9339-3f99d193e016/manager/0.log" Apr 21 15:38:47.542189 ip-10-0-129-133 kubenswrapper[2610]: I0421 15:38:47.542159 2610 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_postgres-868db5846d-wh579_3f6c0ccf-bc8a-467c-b976-97198d25282c/postgres/0.log" Apr 21 15:38:49.319613 ip-10-0-129-133 kubenswrapper[2610]: I0421 15:38:49.319566 2610 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-lws-operator_lws-controller-manager-586c4cccd6-kq2hz_62cf4840-d50f-418c-8cf6-52fb90e36787/manager/0.log" Apr 21 15:38:49.386754 ip-10-0-129-133 kubenswrapper[2610]: I0421 15:38:49.386727 2610 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-lws-operator_openshift-lws-operator-bfc7f696d-qzsjr_6fc4122e-7ed8-4da1-8ee2-8373944c2cbc/openshift-lws-operator/0.log" Apr 21 15:38:54.488527 ip-10-0-129-133 kubenswrapper[2610]: I0421 15:38:54.488500 2610 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator_migrator-74bb7799d9-wjc6n_9c75de96-f2ee-425b-89e6-419195efd0a8/migrator/0.log" Apr 21 15:38:54.554091 ip-10-0-129-133 kubenswrapper[2610]: I0421 15:38:54.554060 2610 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator_migrator-74bb7799d9-wjc6n_9c75de96-f2ee-425b-89e6-419195efd0a8/graceful-termination/0.log"