Apr 17 17:23:59.484403 ip-10-0-136-202 systemd[1]: Starting Kubernetes Kubelet...
Apr 17 17:24:00.000216 ip-10-0-136-202 kubenswrapper[2570]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 17 17:24:00.000216 ip-10-0-136-202 kubenswrapper[2570]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version.
Apr 17 17:24:00.000216 ip-10-0-136-202 kubenswrapper[2570]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 17 17:24:00.000216 ip-10-0-136-202 kubenswrapper[2570]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI.
Apr 17 17:24:00.000216 ip-10-0-136-202 kubenswrapper[2570]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
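The deprecation warnings above all point at the same remedy: move the settings into the file passed via --config. A minimal sketch of that migration, assuming the KubeletConfiguration v1beta1 schema; the socket path, plugin directory, and reservation values below are illustrative placeholders, not values read from this node:

```shell
# Sketch: migrate the deprecated kubelet flags into the --config file.
# Field names follow KubeletConfiguration v1beta1; all values here are
# illustrative placeholders, not this node's actual settings.
cat <<'EOF' > kubelet-config-migration.yaml
apiVersion: kubelet.config.k8s.io/v1beta1
kind: KubeletConfiguration
# replaces --container-runtime-endpoint
containerRuntimeEndpoint: unix:///var/run/crio/crio.sock
# replaces --volume-plugin-dir
volumePluginDir: /etc/kubernetes/kubelet-plugins/volume/exec
# replaces --system-reserved
systemReserved:
  cpu: 500m
  memory: 1Gi
# --minimum-container-ttl-duration is superseded by eviction thresholds
evictionHard:
  memory.available: 100Mi
EOF
```

(--pod-infra-container-image has no config-file equivalent here; per the warning, the image garbage collector will take the sandbox image from the CRI runtime's own configuration instead.)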
Apr 17 17:24:00.002934 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:24:00.002861 2570 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Apr 17 17:24:00.006600 ip-10-0-136-202 kubenswrapper[2570]: W0417 17:24:00.006586 2570 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 17 17:24:00.006600 ip-10-0-136-202 kubenswrapper[2570]: W0417 17:24:00.006599 2570 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 17 17:24:00.006674 ip-10-0-136-202 kubenswrapper[2570]: W0417 17:24:00.006604 2570 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 17 17:24:00.006674 ip-10-0-136-202 kubenswrapper[2570]: W0417 17:24:00.006607 2570 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 17 17:24:00.006674 ip-10-0-136-202 kubenswrapper[2570]: W0417 17:24:00.006610 2570 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 17 17:24:00.006674 ip-10-0-136-202 kubenswrapper[2570]: W0417 17:24:00.006613 2570 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 17 17:24:00.006674 ip-10-0-136-202 kubenswrapper[2570]: W0417 17:24:00.006616 2570 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 17 17:24:00.006674 ip-10-0-136-202 kubenswrapper[2570]: W0417 17:24:00.006619 2570 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 17 17:24:00.006674 ip-10-0-136-202 kubenswrapper[2570]: W0417 17:24:00.006623 2570 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 17 17:24:00.006674 ip-10-0-136-202 kubenswrapper[2570]: W0417 17:24:00.006627 2570 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 17 17:24:00.006674 ip-10-0-136-202 kubenswrapper[2570]: W0417 17:24:00.006631 2570 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 17 17:24:00.006674 ip-10-0-136-202 kubenswrapper[2570]: W0417 17:24:00.006642 2570 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 17 17:24:00.006674 ip-10-0-136-202 kubenswrapper[2570]: W0417 17:24:00.006645 2570 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 17 17:24:00.006674 ip-10-0-136-202 kubenswrapper[2570]: W0417 17:24:00.006648 2570 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 17 17:24:00.006674 ip-10-0-136-202 kubenswrapper[2570]: W0417 17:24:00.006651 2570 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 17 17:24:00.006674 ip-10-0-136-202 kubenswrapper[2570]: W0417 17:24:00.006653 2570 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 17 17:24:00.006674 ip-10-0-136-202 kubenswrapper[2570]: W0417 17:24:00.006656 2570 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 17 17:24:00.006674 ip-10-0-136-202 kubenswrapper[2570]: W0417 17:24:00.006660 2570 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 17 17:24:00.006674 ip-10-0-136-202 kubenswrapper[2570]: W0417 17:24:00.006664 2570 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 17 17:24:00.006674 ip-10-0-136-202 kubenswrapper[2570]: W0417 17:24:00.006667 2570 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 17 17:24:00.006674 ip-10-0-136-202 kubenswrapper[2570]: W0417 17:24:00.006670 2570 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 17 17:24:00.007123 ip-10-0-136-202 kubenswrapper[2570]: W0417 17:24:00.006673 2570 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 17 17:24:00.007123 ip-10-0-136-202 kubenswrapper[2570]: W0417 17:24:00.006676 2570 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 17 17:24:00.007123 ip-10-0-136-202 kubenswrapper[2570]: W0417 17:24:00.006679 2570 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 17 17:24:00.007123 ip-10-0-136-202 kubenswrapper[2570]: W0417 17:24:00.006683 2570 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 17 17:24:00.007123 ip-10-0-136-202 kubenswrapper[2570]: W0417 17:24:00.006686 2570 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 17 17:24:00.007123 ip-10-0-136-202 kubenswrapper[2570]: W0417 17:24:00.006689 2570 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 17 17:24:00.007123 ip-10-0-136-202 kubenswrapper[2570]: W0417 17:24:00.006692 2570 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 17 17:24:00.007123 ip-10-0-136-202 kubenswrapper[2570]: W0417 17:24:00.006694 2570 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 17 17:24:00.007123 ip-10-0-136-202 kubenswrapper[2570]: W0417 17:24:00.006697 2570 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 17 17:24:00.007123 ip-10-0-136-202 kubenswrapper[2570]: W0417 17:24:00.006700 2570 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 17 17:24:00.007123 ip-10-0-136-202 kubenswrapper[2570]: W0417 17:24:00.006702 2570 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 17 17:24:00.007123 ip-10-0-136-202 kubenswrapper[2570]: W0417 17:24:00.006705 2570 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 17 17:24:00.007123 ip-10-0-136-202 kubenswrapper[2570]: W0417 17:24:00.006708 2570 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 17 17:24:00.007123 ip-10-0-136-202 kubenswrapper[2570]: W0417 17:24:00.006711 2570 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 17 17:24:00.007123 ip-10-0-136-202 kubenswrapper[2570]: W0417 17:24:00.006714 2570 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 17 17:24:00.007123 ip-10-0-136-202 kubenswrapper[2570]: W0417 17:24:00.006716 2570 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 17 17:24:00.007123 ip-10-0-136-202 kubenswrapper[2570]: W0417 17:24:00.006718 2570 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 17 17:24:00.007123 ip-10-0-136-202 kubenswrapper[2570]: W0417 17:24:00.006721 2570 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 17 17:24:00.007123 ip-10-0-136-202 kubenswrapper[2570]: W0417 17:24:00.006724 2570 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 17 17:24:00.007123 ip-10-0-136-202 kubenswrapper[2570]: W0417 17:24:00.006726 2570 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 17 17:24:00.007624 ip-10-0-136-202 kubenswrapper[2570]: W0417 17:24:00.006729 2570 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 17 17:24:00.007624 ip-10-0-136-202 kubenswrapper[2570]: W0417 17:24:00.006731 2570 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 17 17:24:00.007624 ip-10-0-136-202 kubenswrapper[2570]: W0417 17:24:00.006734 2570 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 17 17:24:00.007624 ip-10-0-136-202 kubenswrapper[2570]: W0417 17:24:00.006736 2570 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 17 17:24:00.007624 ip-10-0-136-202 kubenswrapper[2570]: W0417 17:24:00.006739 2570 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 17 17:24:00.007624 ip-10-0-136-202 kubenswrapper[2570]: W0417 17:24:00.006741 2570 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 17 17:24:00.007624 ip-10-0-136-202 kubenswrapper[2570]: W0417 17:24:00.006744 2570 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 17 17:24:00.007624 ip-10-0-136-202 kubenswrapper[2570]: W0417 17:24:00.006746 2570 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 17 17:24:00.007624 ip-10-0-136-202 kubenswrapper[2570]: W0417 17:24:00.006749 2570 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 17 17:24:00.007624 ip-10-0-136-202 kubenswrapper[2570]: W0417 17:24:00.006751 2570 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 17 17:24:00.007624 ip-10-0-136-202 kubenswrapper[2570]: W0417 17:24:00.006753 2570 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 17 17:24:00.007624 ip-10-0-136-202 kubenswrapper[2570]: W0417 17:24:00.006756 2570 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 17 17:24:00.007624 ip-10-0-136-202 kubenswrapper[2570]: W0417 17:24:00.006758 2570 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 17 17:24:00.007624 ip-10-0-136-202 kubenswrapper[2570]: W0417 17:24:00.006761 2570 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 17 17:24:00.007624 ip-10-0-136-202 kubenswrapper[2570]: W0417 17:24:00.006764 2570 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 17 17:24:00.007624 ip-10-0-136-202 kubenswrapper[2570]: W0417 17:24:00.006767 2570 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 17 17:24:00.007624 ip-10-0-136-202 kubenswrapper[2570]: W0417 17:24:00.006770 2570 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 17 17:24:00.007624 ip-10-0-136-202 kubenswrapper[2570]: W0417 17:24:00.006773 2570 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 17 17:24:00.007624 ip-10-0-136-202 kubenswrapper[2570]: W0417 17:24:00.006776 2570 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 17 17:24:00.007624 ip-10-0-136-202 kubenswrapper[2570]: W0417 17:24:00.006778 2570 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 17 17:24:00.008200 ip-10-0-136-202 kubenswrapper[2570]: W0417 17:24:00.006781 2570 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 17 17:24:00.008200 ip-10-0-136-202 kubenswrapper[2570]: W0417 17:24:00.006784 2570 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 17 17:24:00.008200 ip-10-0-136-202 kubenswrapper[2570]: W0417 17:24:00.006786 2570 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 17 17:24:00.008200 ip-10-0-136-202 kubenswrapper[2570]: W0417 17:24:00.006789 2570 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 17 17:24:00.008200 ip-10-0-136-202 kubenswrapper[2570]: W0417 17:24:00.006792 2570 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 17 17:24:00.008200 ip-10-0-136-202 kubenswrapper[2570]: W0417 17:24:00.006795 2570 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 17 17:24:00.008200 ip-10-0-136-202 kubenswrapper[2570]: W0417 17:24:00.006797 2570 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 17 17:24:00.008200 ip-10-0-136-202 kubenswrapper[2570]: W0417 17:24:00.006800 2570 feature_gate.go:328] unrecognized feature gate: Example2
Apr 17 17:24:00.008200 ip-10-0-136-202 kubenswrapper[2570]: W0417 17:24:00.006802 2570 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 17 17:24:00.008200 ip-10-0-136-202 kubenswrapper[2570]: W0417 17:24:00.006805 2570 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 17 17:24:00.008200 ip-10-0-136-202 kubenswrapper[2570]: W0417 17:24:00.006807 2570 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 17 17:24:00.008200 ip-10-0-136-202 kubenswrapper[2570]: W0417 17:24:00.006810 2570 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 17 17:24:00.008200 ip-10-0-136-202 kubenswrapper[2570]: W0417 17:24:00.006813 2570 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 17 17:24:00.008200 ip-10-0-136-202 kubenswrapper[2570]: W0417 17:24:00.006815 2570 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 17 17:24:00.008200 ip-10-0-136-202 kubenswrapper[2570]: W0417 17:24:00.006818 2570 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 17 17:24:00.008200 ip-10-0-136-202 kubenswrapper[2570]: W0417 17:24:00.006820 2570 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 17 17:24:00.008200 ip-10-0-136-202 kubenswrapper[2570]: W0417 17:24:00.006822 2570 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 17 17:24:00.008200 ip-10-0-136-202 kubenswrapper[2570]: W0417 17:24:00.006825 2570 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 17 17:24:00.008200 ip-10-0-136-202 kubenswrapper[2570]: W0417 17:24:00.006827 2570 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 17 17:24:00.008200 ip-10-0-136-202 kubenswrapper[2570]: W0417 17:24:00.006830 2570 feature_gate.go:328] unrecognized feature gate: Example
Apr 17 17:24:00.008671 ip-10-0-136-202 kubenswrapper[2570]: W0417 17:24:00.006832 2570 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 17 17:24:00.008671 ip-10-0-136-202 kubenswrapper[2570]: W0417 17:24:00.006835 2570 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 17 17:24:00.008671 ip-10-0-136-202 kubenswrapper[2570]: W0417 17:24:00.006837 2570 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 17 17:24:00.008671 ip-10-0-136-202 kubenswrapper[2570]: W0417 17:24:00.006840 2570 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 17 17:24:00.008671 ip-10-0-136-202 kubenswrapper[2570]: W0417 17:24:00.006843 2570 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 17 17:24:00.008671 ip-10-0-136-202 kubenswrapper[2570]: W0417 17:24:00.007228 2570 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 17 17:24:00.008671 ip-10-0-136-202 kubenswrapper[2570]: W0417 17:24:00.007234 2570 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 17 17:24:00.008671 ip-10-0-136-202 kubenswrapper[2570]: W0417 17:24:00.007237 2570 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 17 17:24:00.008671 ip-10-0-136-202 kubenswrapper[2570]: W0417 17:24:00.007239 2570 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 17 17:24:00.008671 ip-10-0-136-202 kubenswrapper[2570]: W0417 17:24:00.007242 2570 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 17 17:24:00.008671 ip-10-0-136-202 kubenswrapper[2570]: W0417 17:24:00.007245 2570 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 17 17:24:00.008671 ip-10-0-136-202 kubenswrapper[2570]: W0417 17:24:00.007248 2570 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 17 17:24:00.008671 ip-10-0-136-202 kubenswrapper[2570]: W0417 17:24:00.007251 2570 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 17 17:24:00.008671 ip-10-0-136-202 kubenswrapper[2570]: W0417 17:24:00.007253 2570 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 17 17:24:00.008671 ip-10-0-136-202 kubenswrapper[2570]: W0417 17:24:00.007256 2570 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 17 17:24:00.008671 ip-10-0-136-202 kubenswrapper[2570]: W0417 17:24:00.007258 2570 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 17 17:24:00.008671 ip-10-0-136-202 kubenswrapper[2570]: W0417 17:24:00.007261 2570 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 17 17:24:00.008671 ip-10-0-136-202 kubenswrapper[2570]: W0417 17:24:00.007263 2570 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 17 17:24:00.008671 ip-10-0-136-202 kubenswrapper[2570]: W0417 17:24:00.007266 2570 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 17 17:24:00.008671 ip-10-0-136-202 kubenswrapper[2570]: W0417 17:24:00.007268 2570 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 17 17:24:00.009163 ip-10-0-136-202 kubenswrapper[2570]: W0417 17:24:00.007271 2570 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 17 17:24:00.009163 ip-10-0-136-202 kubenswrapper[2570]: W0417 17:24:00.007273 2570 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 17 17:24:00.009163 ip-10-0-136-202 kubenswrapper[2570]: W0417 17:24:00.007276 2570 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 17 17:24:00.009163 ip-10-0-136-202 kubenswrapper[2570]: W0417 17:24:00.007280 2570 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
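The gate names repeat because the kubelet parses the cluster-wide feature-gate list more than once (a second pass of identical warnings begins at 17:24:00.007228), so the raw log overstates the distinct set. A sketch for collapsing the warnings to unique gate names with standard tools; the sample lines are inlined so the pipeline is self-contained, and in practice you would pipe the kubelet unit's journal output instead:

```shell
# Reduce "unrecognized feature gate" warnings to the distinct gate names.
# Sample journal lines are inlined for illustration; in a live cluster,
# feed `journalctl -u <kubelet unit>` into the same pipeline.
cat <<'EOF' > /tmp/kubelet-sample.log
Apr 17 17:24:00.006674 ip-10-0-136-202 kubenswrapper[2570]: W0417 17:24:00.006645 2570 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 17 17:24:00.009163 ip-10-0-136-202 kubenswrapper[2570]: W0417 17:24:00.007291 2570 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 17 17:24:00.006674 ip-10-0-136-202 kubenswrapper[2570]: W0417 17:24:00.006648 2570 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
EOF
grep -o 'unrecognized feature gate: [A-Za-z0-9]*' /tmp/kubelet-sample.log \
  | awk '{print $4}' | sort -u
# prints:
# ImageModeStatusReporting
# NetworkSegmentation
```

These warnings are noisy but harmless here: the names are OpenShift cluster-level gates that the kubelet's own gate registry does not know, so the kubelet simply ignores them.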
Apr 17 17:24:00.009163 ip-10-0-136-202 kubenswrapper[2570]: W0417 17:24:00.007284 2570 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 17 17:24:00.009163 ip-10-0-136-202 kubenswrapper[2570]: W0417 17:24:00.007286 2570 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 17 17:24:00.009163 ip-10-0-136-202 kubenswrapper[2570]: W0417 17:24:00.007289 2570 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 17 17:24:00.009163 ip-10-0-136-202 kubenswrapper[2570]: W0417 17:24:00.007291 2570 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 17 17:24:00.009163 ip-10-0-136-202 kubenswrapper[2570]: W0417 17:24:00.007294 2570 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 17 17:24:00.009163 ip-10-0-136-202 kubenswrapper[2570]: W0417 17:24:00.007296 2570 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 17 17:24:00.009163 ip-10-0-136-202 kubenswrapper[2570]: W0417 17:24:00.007299 2570 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 17 17:24:00.009163 ip-10-0-136-202 kubenswrapper[2570]: W0417 17:24:00.007302 2570 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 17 17:24:00.009163 ip-10-0-136-202 kubenswrapper[2570]: W0417 17:24:00.007304 2570 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 17 17:24:00.009163 ip-10-0-136-202 kubenswrapper[2570]: W0417 17:24:00.007306 2570 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 17 17:24:00.009163 ip-10-0-136-202 kubenswrapper[2570]: W0417 17:24:00.007309 2570 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 17 17:24:00.009163 ip-10-0-136-202 kubenswrapper[2570]: W0417 17:24:00.007312 2570 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 17 17:24:00.009163 ip-10-0-136-202 kubenswrapper[2570]: W0417 17:24:00.007315 2570 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 17 17:24:00.009163 ip-10-0-136-202 kubenswrapper[2570]: W0417 17:24:00.007318 2570 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 17 17:24:00.009163 ip-10-0-136-202 kubenswrapper[2570]: W0417 17:24:00.007321 2570 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 17 17:24:00.009676 ip-10-0-136-202 kubenswrapper[2570]: W0417 17:24:00.007324 2570 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 17 17:24:00.009676 ip-10-0-136-202 kubenswrapper[2570]: W0417 17:24:00.007326 2570 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 17 17:24:00.009676 ip-10-0-136-202 kubenswrapper[2570]: W0417 17:24:00.007328 2570 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 17 17:24:00.009676 ip-10-0-136-202 kubenswrapper[2570]: W0417 17:24:00.007331 2570 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 17 17:24:00.009676 ip-10-0-136-202 kubenswrapper[2570]: W0417 17:24:00.007334 2570 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 17 17:24:00.009676 ip-10-0-136-202 kubenswrapper[2570]: W0417 17:24:00.007336 2570 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 17 17:24:00.009676 ip-10-0-136-202 kubenswrapper[2570]: W0417 17:24:00.007339 2570 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 17 17:24:00.009676 ip-10-0-136-202 kubenswrapper[2570]: W0417 17:24:00.007341 2570 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 17 17:24:00.009676 ip-10-0-136-202 kubenswrapper[2570]: W0417 17:24:00.007344 2570 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 17 17:24:00.009676 ip-10-0-136-202 kubenswrapper[2570]: W0417 17:24:00.007346 2570 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 17 17:24:00.009676 ip-10-0-136-202 kubenswrapper[2570]: W0417 17:24:00.007349 2570 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 17 17:24:00.009676 ip-10-0-136-202 kubenswrapper[2570]: W0417 17:24:00.007351 2570 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 17 17:24:00.009676 ip-10-0-136-202 kubenswrapper[2570]: W0417 17:24:00.007353 2570 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 17 17:24:00.009676 ip-10-0-136-202 kubenswrapper[2570]: W0417 17:24:00.007356 2570 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 17 17:24:00.009676 ip-10-0-136-202 kubenswrapper[2570]: W0417 17:24:00.007358 2570 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 17 17:24:00.009676 ip-10-0-136-202 kubenswrapper[2570]: W0417 17:24:00.007361 2570 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 17 17:24:00.009676 ip-10-0-136-202 kubenswrapper[2570]: W0417 17:24:00.007363 2570 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 17 17:24:00.009676 ip-10-0-136-202 kubenswrapper[2570]: W0417 17:24:00.007367 2570 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 17 17:24:00.009676 ip-10-0-136-202 kubenswrapper[2570]: W0417 17:24:00.007371 2570 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 17 17:24:00.010156 ip-10-0-136-202 kubenswrapper[2570]: W0417 17:24:00.007374 2570 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 17 17:24:00.010156 ip-10-0-136-202 kubenswrapper[2570]: W0417 17:24:00.007377 2570 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 17 17:24:00.010156 ip-10-0-136-202 kubenswrapper[2570]: W0417 17:24:00.007379 2570 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 17 17:24:00.010156 ip-10-0-136-202 kubenswrapper[2570]: W0417 17:24:00.007382 2570 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 17 17:24:00.010156 ip-10-0-136-202 kubenswrapper[2570]: W0417 17:24:00.007385 2570 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 17 17:24:00.010156 ip-10-0-136-202 kubenswrapper[2570]: W0417 17:24:00.007388 2570 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 17 17:24:00.010156 ip-10-0-136-202 kubenswrapper[2570]: W0417 17:24:00.007391 2570 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 17 17:24:00.010156 ip-10-0-136-202 kubenswrapper[2570]: W0417 17:24:00.007393 2570 feature_gate.go:328] unrecognized feature gate: Example2
Apr 17 17:24:00.010156 ip-10-0-136-202 kubenswrapper[2570]: W0417 17:24:00.007396 2570 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 17 17:24:00.010156 ip-10-0-136-202 kubenswrapper[2570]: W0417 17:24:00.007399 2570 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 17 17:24:00.010156 ip-10-0-136-202 kubenswrapper[2570]: W0417 17:24:00.007402 2570 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 17 17:24:00.010156 ip-10-0-136-202 kubenswrapper[2570]: W0417 17:24:00.007405 2570 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 17 17:24:00.010156 ip-10-0-136-202 kubenswrapper[2570]: W0417 17:24:00.007414 2570 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 17 17:24:00.010156 ip-10-0-136-202 kubenswrapper[2570]: W0417 17:24:00.007417 2570 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 17 17:24:00.010156 ip-10-0-136-202 kubenswrapper[2570]: W0417 17:24:00.007419 2570 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 17 17:24:00.010156 ip-10-0-136-202 kubenswrapper[2570]: W0417 17:24:00.007422 2570 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 17 17:24:00.010156 ip-10-0-136-202 kubenswrapper[2570]: W0417 17:24:00.007424 2570 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 17 17:24:00.010156 ip-10-0-136-202 kubenswrapper[2570]: W0417 17:24:00.007427 2570 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 17 17:24:00.010156 ip-10-0-136-202 kubenswrapper[2570]: W0417 17:24:00.007430 2570 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 17 17:24:00.010156 ip-10-0-136-202 kubenswrapper[2570]: W0417 17:24:00.007432 2570 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 17 17:24:00.010637 ip-10-0-136-202 kubenswrapper[2570]: W0417 17:24:00.007436 2570 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 17 17:24:00.010637 ip-10-0-136-202 kubenswrapper[2570]: W0417 17:24:00.007439 2570 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 17 17:24:00.010637 ip-10-0-136-202 kubenswrapper[2570]: W0417 17:24:00.007441 2570 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 17 17:24:00.010637 ip-10-0-136-202 kubenswrapper[2570]: W0417 17:24:00.007444 2570 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 17 17:24:00.010637 ip-10-0-136-202 kubenswrapper[2570]: W0417 17:24:00.007446 2570 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 17 17:24:00.010637 ip-10-0-136-202 kubenswrapper[2570]: W0417 17:24:00.007449 2570 feature_gate.go:328] unrecognized feature gate: Example
Apr 17 17:24:00.010637 ip-10-0-136-202 kubenswrapper[2570]: W0417 17:24:00.007451 2570 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 17 17:24:00.010637 ip-10-0-136-202 kubenswrapper[2570]: W0417 17:24:00.007454 2570 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 17 17:24:00.010637 ip-10-0-136-202 kubenswrapper[2570]: W0417 17:24:00.007457 2570 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 17 17:24:00.010637 ip-10-0-136-202 kubenswrapper[2570]: W0417 17:24:00.007459 2570 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 17 17:24:00.010637 ip-10-0-136-202 kubenswrapper[2570]: W0417 17:24:00.007461 2570 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 17 17:24:00.010637 ip-10-0-136-202 kubenswrapper[2570]: W0417 17:24:00.007464 2570 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 17 17:24:00.010637 ip-10-0-136-202 kubenswrapper[2570]: W0417 17:24:00.007466 2570 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 17 17:24:00.010637 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:24:00.008917 2570 flags.go:64] FLAG: --address="0.0.0.0"
Apr 17 17:24:00.010637 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:24:00.008926 2570 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Apr 17 17:24:00.010637 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:24:00.008937 2570 flags.go:64] FLAG: --anonymous-auth="true"
Apr 17 17:24:00.010637 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:24:00.008941 2570 flags.go:64] FLAG: --application-metrics-count-limit="100"
Apr 17 17:24:00.010637 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:24:00.008945 2570 flags.go:64] FLAG: --authentication-token-webhook="false"
Apr 17 17:24:00.010637 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:24:00.008949 2570 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
Apr 17 17:24:00.010637 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:24:00.008953 2570 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
Apr 17 17:24:00.010637 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:24:00.008957 2570 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
Apr 17 17:24:00.011157 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:24:00.008960 2570 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
Apr 17 17:24:00.011157 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:24:00.008964 2570 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
Apr 17 17:24:00.011157 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:24:00.008967 2570 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
Apr 17 17:24:00.011157 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:24:00.008971 2570 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Apr 17 17:24:00.011157 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:24:00.008974 2570 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Apr 17 17:24:00.011157 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:24:00.008977 2570 flags.go:64] FLAG: --cgroup-root=""
Apr 17 17:24:00.011157 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:24:00.008980 2570 flags.go:64] FLAG: --cgroups-per-qos="true"
Apr 17 17:24:00.011157 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:24:00.008983 2570 flags.go:64] FLAG: --client-ca-file=""
Apr 17 17:24:00.011157 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:24:00.008985 2570 flags.go:64] FLAG: --cloud-config=""
Apr 17 17:24:00.011157 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:24:00.008989 2570 flags.go:64] FLAG: --cloud-provider="external"
Apr 17 17:24:00.011157 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:24:00.008993 2570 flags.go:64] FLAG: --cluster-dns="[]"
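The flags.go:64] FLAG: entries around this point dump every kubelet command-line flag with its effective value, defaults included. To read the dump as plain name=value pairs, the journal and klog prefixes can be stripped; a sketch with two sample lines inlined so it is self-contained, in practice you would pipe the kubelet unit's journal output through the same sed instead:

```shell
# Strip the journal/klog prefix from kubelet "FLAG:" dump lines, leaving
# plain --name="value" pairs. Sample lines inlined for illustration.
cat <<'EOF' > /tmp/kubelet-flags.log
Apr 17 17:24:00.011157 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:24:00.009000 2570 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Apr 17 17:24:00.011157 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:24:00.009016 2570 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
EOF
sed -n 's/.*flags\.go:64] FLAG: //p' /tmp/kubelet-flags.log
# prints:
# --config="/etc/kubernetes/kubelet.conf"
# --container-runtime-endpoint="/var/run/crio/crio.sock"
```

Note that the dump shows flag values before the --config file is merged, which is why several entries (e.g. --cgroup-driver, --cluster-dns) still show upstream defaults rather than this node's final configuration.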
Apr 17 17:24:00.011157 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:24:00.008997 2570 flags.go:64] FLAG: --cluster-domain=""
Apr 17 17:24:00.011157 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:24:00.009000 2570 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Apr 17 17:24:00.011157 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:24:00.009003 2570 flags.go:64] FLAG: --config-dir=""
Apr 17 17:24:00.011157 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:24:00.009006 2570 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
Apr 17 17:24:00.011157 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:24:00.009009 2570 flags.go:64] FLAG: --container-log-max-files="5"
Apr 17 17:24:00.011157 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:24:00.009013 2570 flags.go:64] FLAG: --container-log-max-size="10Mi"
Apr 17 17:24:00.011157 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:24:00.009016 2570 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
Apr 17 17:24:00.011157 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:24:00.009019 2570 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock"
Apr 17 17:24:00.011157 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:24:00.009022 2570 flags.go:64] FLAG: --containerd-namespace="k8s.io"
Apr 17 17:24:00.011157 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:24:00.009025 2570 flags.go:64] FLAG: --contention-profiling="false"
Apr 17 17:24:00.011157 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:24:00.009028 2570 flags.go:64] FLAG: --cpu-cfs-quota="true"
Apr 17 17:24:00.011157 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:24:00.009031 2570 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms"
Apr 17 17:24:00.011157 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:24:00.009034 2570 flags.go:64] FLAG: --cpu-manager-policy="none"
Apr 17 17:24:00.011157 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:24:00.009037 2570 flags.go:64] FLAG: --cpu-manager-policy-options=""
Apr 17 17:24:00.011762 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:24:00.009041 2570 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s"
Apr 17 17:24:00.011762 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:24:00.009045 2570 flags.go:64] FLAG: --enable-controller-attach-detach="true"
Apr 17 17:24:00.011762 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:24:00.009048 2570 flags.go:64] FLAG: --enable-debugging-handlers="true"
Apr 17 17:24:00.011762 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:24:00.009051 2570 flags.go:64] FLAG: --enable-load-reader="false"
Apr 17 17:24:00.011762 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:24:00.009054 2570 flags.go:64] FLAG: --enable-server="true"
Apr 17 17:24:00.011762 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:24:00.009057 2570 flags.go:64] FLAG: --enforce-node-allocatable="[pods]"
Apr 17 17:24:00.011762 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:24:00.009060 2570 flags.go:64] FLAG: --event-burst="100"
Apr 17 17:24:00.011762 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:24:00.009064 2570 flags.go:64] FLAG: --event-qps="50"
Apr 17 17:24:00.011762 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:24:00.009067 2570 flags.go:64] FLAG: --event-storage-age-limit="default=0"
Apr 17 17:24:00.011762 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:24:00.009070 2570 flags.go:64] FLAG: --event-storage-event-limit="default=0"
Apr 17 17:24:00.011762 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:24:00.009073 2570 flags.go:64] FLAG: --eviction-hard=""
Apr 17 17:24:00.011762 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:24:00.009076 2570 flags.go:64] FLAG: --eviction-max-pod-grace-period="0"
Apr 17 17:24:00.011762 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:24:00.009079 2570 flags.go:64] FLAG: --eviction-minimum-reclaim=""
Apr 17 17:24:00.011762 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:24:00.009082 2570 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s"
Apr 17 17:24:00.011762 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:24:00.009085 2570 flags.go:64] FLAG: --eviction-soft=""
Apr 17 17:24:00.011762 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:24:00.009088 2570 flags.go:64] FLAG: --eviction-soft-grace-period=""
Apr 17 17:24:00.011762 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:24:00.009091 2570 flags.go:64] FLAG: --exit-on-lock-contention="false"
Apr 17 17:24:00.011762 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:24:00.009094 2570 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false"
Apr 17 17:24:00.011762 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:24:00.009098 2570 flags.go:64] FLAG: --experimental-mounter-path=""
Apr 17 17:24:00.011762 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:24:00.009100 2570 flags.go:64] FLAG: --fail-cgroupv1="false"
Apr 17 17:24:00.011762 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:24:00.009103 2570 flags.go:64] FLAG: --fail-swap-on="true"
Apr 17 17:24:00.011762 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:24:00.009106 2570 flags.go:64] FLAG: --feature-gates=""
Apr 17 17:24:00.011762 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:24:00.009109 2570 flags.go:64] FLAG: --file-check-frequency="20s"
Apr 17 17:24:00.011762 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:24:00.009112 2570 flags.go:64] FLAG: --global-housekeeping-interval="1m0s"
Apr 17 17:24:00.011762 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:24:00.009115 2570 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge"
Apr 17 17:24:00.012367 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:24:00.009118 2570 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1"
Apr 17 17:24:00.012367 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:24:00.009121 2570 flags.go:64] FLAG: --healthz-port="10248"
Apr 17 17:24:00.012367 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:24:00.009125 2570 flags.go:64] FLAG: --help="false"
Apr 17 17:24:00.012367 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:24:00.009128 2570 flags.go:64] FLAG: --hostname-override="ip-10-0-136-202.ec2.internal"
Apr 17 17:24:00.012367 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:24:00.009131 2570 flags.go:64] FLAG: --housekeeping-interval="10s"
Apr 17 17:24:00.012367 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:24:00.009133 2570 flags.go:64] FLAG: --http-check-frequency="20s"
Apr 17 17:24:00.012367 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:24:00.009152 2570 flags.go:64] FLAG: --image-credential-provider-bin-dir="/usr/libexec/kubelet-image-credential-provider-plugins"
Apr 17 17:24:00.012367 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:24:00.009155 2570 flags.go:64] FLAG: --image-credential-provider-config="/etc/kubernetes/credential-providers/ecr-credential-provider.yaml"
Apr 17 17:24:00.012367 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:24:00.009159 2570 flags.go:64] FLAG: --image-gc-high-threshold="85"
Apr 17 17:24:00.012367 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:24:00.009162 2570 flags.go:64] FLAG: --image-gc-low-threshold="80"
Apr 17 17:24:00.012367 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:24:00.009165 2570 flags.go:64] FLAG: --image-service-endpoint=""
Apr 17 17:24:00.012367 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:24:00.009168 2570 flags.go:64] FLAG: --kernel-memcg-notification="false"
Apr 17 17:24:00.012367 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:24:00.009171 2570 flags.go:64] FLAG: --kube-api-burst="100"
Apr 17 17:24:00.012367 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:24:00.009174 2570 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf"
Apr 17 17:24:00.012367 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:24:00.009177 2570 flags.go:64] FLAG: --kube-api-qps="50"
Apr 17 17:24:00.012367 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:24:00.009181 2570 flags.go:64] FLAG: --kube-reserved=""
Apr 17 17:24:00.012367 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:24:00.009184 2570 flags.go:64] FLAG: --kube-reserved-cgroup=""
Apr 17 17:24:00.012367 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:24:00.009186 2570 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig"
Apr 17 17:24:00.012367 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:24:00.009190 2570 flags.go:64] FLAG: --kubelet-cgroups=""
Apr 17 17:24:00.012367 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:24:00.009193 2570 flags.go:64] FLAG: --local-storage-capacity-isolation="true"
Apr 17 17:24:00.012367 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:24:00.009195 2570 flags.go:64] FLAG: --lock-file=""
Apr 17 17:24:00.012367 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:24:00.009198 2570 flags.go:64] FLAG: --log-cadvisor-usage="false"
Apr 17 17:24:00.012367 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:24:00.009201 2570 flags.go:64] FLAG: --log-flush-frequency="5s"
Apr 17 17:24:00.012367 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:24:00.009204 2570 flags.go:64] FLAG: --log-json-info-buffer-size="0"
Apr 17 17:24:00.012944 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:24:00.009209 2570 flags.go:64] FLAG: --log-json-split-stream="false"
Apr 17 17:24:00.012944 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:24:00.009212 2570 flags.go:64] FLAG: --log-text-info-buffer-size="0"
Apr 17 17:24:00.012944 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:24:00.009216 2570 flags.go:64] FLAG: --log-text-split-stream="false"
Apr 17 17:24:00.012944 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:24:00.009218 2570 flags.go:64] FLAG: --logging-format="text"
Apr 17 17:24:00.012944 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:24:00.009221 2570 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id"
Apr 17 17:24:00.012944 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:24:00.009224 2570 flags.go:64] FLAG: --make-iptables-util-chains="true"
Apr 17 17:24:00.012944 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:24:00.009227 2570 flags.go:64] FLAG: --manifest-url=""
Apr 17 17:24:00.012944 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:24:00.009230 2570 flags.go:64] FLAG: --manifest-url-header=""
Apr 17 17:24:00.012944 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:24:00.009234 2570 flags.go:64] FLAG: --max-housekeeping-interval="15s"
Apr 17 17:24:00.012944 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:24:00.009237 2570 flags.go:64] FLAG: --max-open-files="1000000"
Apr 17 17:24:00.012944 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:24:00.009241 2570 flags.go:64] FLAG: --max-pods="110"
Apr 17 17:24:00.012944 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:24:00.009244 2570 flags.go:64] FLAG: --maximum-dead-containers="-1"
Apr 17 17:24:00.012944 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:24:00.009247 2570 flags.go:64] FLAG: --maximum-dead-containers-per-container="1"
Apr 17 17:24:00.012944 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:24:00.009250 2570 flags.go:64] FLAG: --memory-manager-policy="None"
Apr 17 17:24:00.012944 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:24:00.009253 2570 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s"
Apr 17 17:24:00.012944 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:24:00.009256 2570 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s"
Apr 17 17:24:00.012944 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:24:00.009259 2570 flags.go:64] FLAG: --node-ip="0.0.0.0"
Apr 17 17:24:00.012944 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:24:00.009262 2570 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/worker=,node.openshift.io/os_id=rhel"
Apr 17 17:24:00.012944 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:24:00.009269 2570 flags.go:64] FLAG: --node-status-max-images="50"
Apr 17 17:24:00.012944 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:24:00.009272 2570 flags.go:64] FLAG: --node-status-update-frequency="10s"
Apr 17 17:24:00.012944 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:24:00.009275 2570 flags.go:64] FLAG: --oom-score-adj="-999"
Apr 17 17:24:00.012944 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:24:00.009278 2570 flags.go:64] FLAG: --pod-cidr=""
Apr 17 17:24:00.012944 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:24:00.009281 2570 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c8cfe89231412ff3ee8cb6207fa0be33cad0f08e88c9c0f1e9f7e8c6f14d6715"
Apr 17 17:24:00.013528 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:24:00.009287 2570 flags.go:64] FLAG: --pod-manifest-path=""
Apr 17 17:24:00.013528 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:24:00.009290 2570 flags.go:64] FLAG: --pod-max-pids="-1"
Apr 17 17:24:00.013528 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:24:00.009294 2570 flags.go:64] FLAG: --pods-per-core="0"
Apr 17 17:24:00.013528 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:24:00.009296 2570 flags.go:64] FLAG: --port="10250"
Apr 17 17:24:00.013528 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:24:00.009299 2570 flags.go:64] FLAG: --protect-kernel-defaults="false"
Apr 17 17:24:00.013528 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:24:00.009302 2570 flags.go:64] FLAG: --provider-id="aws:///us-east-1a/i-02718eaf05f91d9c2"
Apr 17 17:24:00.013528 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:24:00.009305 2570 flags.go:64] FLAG: --qos-reserved=""
Apr 17 17:24:00.013528 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:24:00.009308 2570 flags.go:64] FLAG: --read-only-port="10255"
Apr 17 17:24:00.013528 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:24:00.009311 2570 flags.go:64] FLAG: --register-node="true"
Apr 17 17:24:00.013528 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:24:00.009314 2570 flags.go:64] FLAG: --register-schedulable="true"
Apr 17 17:24:00.013528 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:24:00.009317 2570 flags.go:64] FLAG: --register-with-taints=""
Apr 17 17:24:00.013528 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:24:00.009320 2570 flags.go:64] FLAG: --registry-burst="10"
Apr 17 17:24:00.013528 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:24:00.009329 2570 flags.go:64] FLAG: --registry-qps="5"
Apr 17 17:24:00.013528 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:24:00.009332 2570 flags.go:64] FLAG: --reserved-cpus=""
Apr 17 17:24:00.013528 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:24:00.009335 2570 flags.go:64] FLAG: --reserved-memory=""
Apr 17 17:24:00.013528 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:24:00.009339 2570 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf"
Apr 17 17:24:00.013528 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:24:00.009342 2570 flags.go:64] FLAG: --root-dir="/var/lib/kubelet"
Apr 17 17:24:00.013528 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:24:00.009345 2570 flags.go:64] FLAG: --rotate-certificates="false"
Apr 17 17:24:00.013528 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:24:00.009347 2570 flags.go:64] FLAG: --rotate-server-certificates="false"
Apr 17 17:24:00.013528 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:24:00.009350 2570 flags.go:64] FLAG: --runonce="false"
Apr 17 17:24:00.013528 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:24:00.009353 2570 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service"
Apr 17 17:24:00.013528 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:24:00.009356 2570 flags.go:64] FLAG: --runtime-request-timeout="2m0s"
Apr 17 17:24:00.013528 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:24:00.009359 2570 flags.go:64] FLAG: --seccomp-default="false"
Apr 17 17:24:00.013528 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:24:00.009362 2570 flags.go:64] FLAG: --serialize-image-pulls="true"
Apr 17 17:24:00.013528 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:24:00.009365 2570 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s"
Apr 17 17:24:00.013528 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:24:00.009368 2570 flags.go:64] FLAG: --storage-driver-db="cadvisor"
Apr 17 17:24:00.014133 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:24:00.009371 2570 flags.go:64] FLAG: --storage-driver-host="localhost:8086"
Apr 17 17:24:00.014133 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:24:00.009374 2570 flags.go:64] FLAG: --storage-driver-password="root"
Apr 17 17:24:00.014133 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:24:00.009377 2570 flags.go:64] FLAG: --storage-driver-secure="false"
Apr 17 17:24:00.014133 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:24:00.009379 2570 flags.go:64] FLAG: --storage-driver-table="stats"
Apr 17 17:24:00.014133 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:24:00.009382 2570 flags.go:64] FLAG: --storage-driver-user="root"
Apr 17 17:24:00.014133 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:24:00.009384 2570 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s"
Apr 17 17:24:00.014133 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:24:00.009387 2570 flags.go:64] FLAG: --sync-frequency="1m0s"
Apr 17 17:24:00.014133 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:24:00.009391 2570 flags.go:64] FLAG: --system-cgroups=""
Apr 17 17:24:00.014133 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:24:00.009394 2570 flags.go:64] FLAG: --system-reserved="cpu=500m,ephemeral-storage=1Gi,memory=1Gi"
Apr 17 17:24:00.014133 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:24:00.009399 2570 flags.go:64] FLAG: --system-reserved-cgroup=""
Apr 17 17:24:00.014133 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:24:00.009402 2570 flags.go:64] FLAG: --tls-cert-file=""
Apr 17 17:24:00.014133 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:24:00.009405 2570 flags.go:64] FLAG: --tls-cipher-suites="[]"
Apr 17 17:24:00.014133 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:24:00.009410 2570 flags.go:64] FLAG: --tls-min-version=""
Apr 17 17:24:00.014133 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:24:00.009413 2570 flags.go:64] FLAG: --tls-private-key-file=""
Apr 17 17:24:00.014133 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:24:00.009415 2570 flags.go:64] FLAG: --topology-manager-policy="none"
Apr 17 17:24:00.014133 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:24:00.009418 2570 flags.go:64] FLAG: --topology-manager-policy-options=""
Apr 17 17:24:00.014133 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:24:00.009421 2570 flags.go:64] FLAG: --topology-manager-scope="container"
Apr 17 17:24:00.014133 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:24:00.009424 2570 flags.go:64] FLAG: --v="2"
Apr 17 17:24:00.014133 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:24:00.009428 2570 flags.go:64] FLAG: --version="false"
Apr 17 17:24:00.014133 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:24:00.009433 2570 flags.go:64] FLAG: --vmodule=""
Apr 17 17:24:00.014133 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:24:00.009442 2570 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec"
Apr 17 17:24:00.014133 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:24:00.009447 2570 flags.go:64] FLAG: --volume-stats-agg-period="1m0s"
Apr 17 17:24:00.014133 ip-10-0-136-202 kubenswrapper[2570]: W0417 17:24:00.009549 2570 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 17 17:24:00.014133 ip-10-0-136-202 kubenswrapper[2570]: W0417 17:24:00.009553 2570 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 17 17:24:00.014764 ip-10-0-136-202 kubenswrapper[2570]: W0417 17:24:00.009556 2570 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 17 17:24:00.014764 ip-10-0-136-202 kubenswrapper[2570]: W0417 17:24:00.009558 2570 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 17 17:24:00.014764 ip-10-0-136-202 kubenswrapper[2570]: W0417 17:24:00.009561 2570 feature_gate.go:328] unrecognized feature gate: Example2
Apr 17 17:24:00.014764 ip-10-0-136-202 kubenswrapper[2570]: W0417 17:24:00.009564 2570 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 17 17:24:00.014764 ip-10-0-136-202 kubenswrapper[2570]: W0417 17:24:00.009567 2570 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 17 17:24:00.014764 ip-10-0-136-202 kubenswrapper[2570]: W0417 17:24:00.009571 2570 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 17 17:24:00.014764 ip-10-0-136-202 kubenswrapper[2570]: W0417 17:24:00.009576 2570 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 17 17:24:00.014764 ip-10-0-136-202 kubenswrapper[2570]: W0417 17:24:00.009579 2570 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 17 17:24:00.014764 ip-10-0-136-202 kubenswrapper[2570]: W0417 17:24:00.009582 2570 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 17 17:24:00.014764 ip-10-0-136-202 kubenswrapper[2570]: W0417 17:24:00.009584 2570 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 17 17:24:00.014764 ip-10-0-136-202 kubenswrapper[2570]: W0417 17:24:00.009588 2570 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 17 17:24:00.014764 ip-10-0-136-202 kubenswrapper[2570]: W0417 17:24:00.009590 2570 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 17 17:24:00.014764 ip-10-0-136-202 kubenswrapper[2570]: W0417 17:24:00.009593 2570 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 17 17:24:00.014764 ip-10-0-136-202 kubenswrapper[2570]: W0417 17:24:00.009596 2570 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 17 17:24:00.014764 ip-10-0-136-202 kubenswrapper[2570]: W0417 17:24:00.009598 2570 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 17 17:24:00.014764 ip-10-0-136-202 kubenswrapper[2570]: W0417 17:24:00.009601 2570 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 17 17:24:00.014764 ip-10-0-136-202 kubenswrapper[2570]: W0417 17:24:00.009604 2570 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 17 17:24:00.014764 ip-10-0-136-202 kubenswrapper[2570]: W0417 17:24:00.009606 2570 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 17 17:24:00.014764 ip-10-0-136-202 kubenswrapper[2570]: W0417 17:24:00.009609 2570 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 17 17:24:00.015246 ip-10-0-136-202 kubenswrapper[2570]: W0417 17:24:00.009611 2570 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 17 17:24:00.015246 ip-10-0-136-202 kubenswrapper[2570]: W0417 17:24:00.009613 2570 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 17 17:24:00.015246 ip-10-0-136-202 kubenswrapper[2570]: W0417 17:24:00.009616 2570 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 17 17:24:00.015246 ip-10-0-136-202 kubenswrapper[2570]: W0417 17:24:00.009618 2570 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 17 17:24:00.015246 ip-10-0-136-202 kubenswrapper[2570]: W0417 17:24:00.009621 2570 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 17 17:24:00.015246 ip-10-0-136-202 kubenswrapper[2570]: W0417 17:24:00.009623 2570 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 17 17:24:00.015246 ip-10-0-136-202 kubenswrapper[2570]: W0417 17:24:00.009626 2570 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 17 17:24:00.015246 ip-10-0-136-202 kubenswrapper[2570]: W0417 17:24:00.009629 2570 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 17 17:24:00.015246 ip-10-0-136-202 kubenswrapper[2570]: W0417 17:24:00.009633 2570 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 17 17:24:00.015246 ip-10-0-136-202 kubenswrapper[2570]: W0417 17:24:00.009637 2570 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 17 17:24:00.015246 ip-10-0-136-202 kubenswrapper[2570]: W0417 17:24:00.009640 2570 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 17 17:24:00.015246 ip-10-0-136-202 kubenswrapper[2570]: W0417 17:24:00.009643 2570 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 17 17:24:00.015246 ip-10-0-136-202 kubenswrapper[2570]: W0417 17:24:00.009646 2570 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 17 17:24:00.015246 ip-10-0-136-202 kubenswrapper[2570]: W0417 17:24:00.009649 2570 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 17 17:24:00.015246 ip-10-0-136-202 kubenswrapper[2570]: W0417 17:24:00.009652 2570 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 17 17:24:00.015246 ip-10-0-136-202 kubenswrapper[2570]: W0417 17:24:00.009655 2570 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 17 17:24:00.015246 ip-10-0-136-202 kubenswrapper[2570]: W0417 17:24:00.009657 2570 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 17 17:24:00.015246 ip-10-0-136-202 kubenswrapper[2570]: W0417 17:24:00.009659 2570 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 17 17:24:00.015246 ip-10-0-136-202 kubenswrapper[2570]: W0417 17:24:00.009668 2570 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 17 17:24:00.015706 ip-10-0-136-202 kubenswrapper[2570]: W0417 17:24:00.009675 2570 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 17 17:24:00.015706 ip-10-0-136-202 kubenswrapper[2570]: W0417 17:24:00.009678 2570 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 17 17:24:00.015706 ip-10-0-136-202 kubenswrapper[2570]: W0417 17:24:00.009681 2570 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 17 17:24:00.015706 ip-10-0-136-202 kubenswrapper[2570]: W0417 17:24:00.009683 2570 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 17 17:24:00.015706 ip-10-0-136-202 kubenswrapper[2570]: W0417 17:24:00.009686 2570 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 17 17:24:00.015706 ip-10-0-136-202 kubenswrapper[2570]: W0417 17:24:00.009689 2570 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 17 17:24:00.015706 ip-10-0-136-202 kubenswrapper[2570]: W0417 17:24:00.009692 2570 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 17 17:24:00.015706 ip-10-0-136-202 kubenswrapper[2570]: W0417 17:24:00.009694 2570 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 17 17:24:00.015706 ip-10-0-136-202 kubenswrapper[2570]: W0417 17:24:00.009697 2570 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 17 17:24:00.015706 ip-10-0-136-202 kubenswrapper[2570]: W0417 17:24:00.009700 2570 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 17 17:24:00.015706 ip-10-0-136-202 kubenswrapper[2570]: W0417 17:24:00.009702 2570 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 17 17:24:00.015706 ip-10-0-136-202 kubenswrapper[2570]: W0417 17:24:00.009705 2570 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 17 17:24:00.015706 ip-10-0-136-202 kubenswrapper[2570]: W0417 17:24:00.009707 2570 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 17 17:24:00.015706 ip-10-0-136-202 kubenswrapper[2570]: W0417 17:24:00.009710 2570 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 17 17:24:00.015706 ip-10-0-136-202 kubenswrapper[2570]: W0417 17:24:00.009713 2570 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 17 17:24:00.015706 ip-10-0-136-202 kubenswrapper[2570]: W0417 17:24:00.009715 2570 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 17 17:24:00.015706 ip-10-0-136-202 kubenswrapper[2570]: W0417 17:24:00.009717 2570 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 17 17:24:00.015706 ip-10-0-136-202 kubenswrapper[2570]: W0417 17:24:00.009720 2570 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 17 17:24:00.015706 ip-10-0-136-202 kubenswrapper[2570]: W0417 17:24:00.009723 2570 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 17 17:24:00.015706 ip-10-0-136-202 kubenswrapper[2570]: W0417 17:24:00.009725 2570 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 17 17:24:00.016220 ip-10-0-136-202 kubenswrapper[2570]: W0417 17:24:00.009728 2570 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 17 17:24:00.016220 ip-10-0-136-202 kubenswrapper[2570]: W0417 17:24:00.009730 2570 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 17 17:24:00.016220 ip-10-0-136-202 kubenswrapper[2570]: W0417 17:24:00.009732 2570 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 17 17:24:00.016220 ip-10-0-136-202 kubenswrapper[2570]: W0417 17:24:00.009743 2570 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 17 17:24:00.016220 ip-10-0-136-202 kubenswrapper[2570]: W0417 17:24:00.009746 2570 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 17 17:24:00.016220 ip-10-0-136-202 kubenswrapper[2570]: W0417 17:24:00.009748 2570 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 17 17:24:00.016220 ip-10-0-136-202 kubenswrapper[2570]: W0417 17:24:00.009751 2570 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 17 17:24:00.016220 ip-10-0-136-202 kubenswrapper[2570]: W0417 17:24:00.009753 2570 feature_gate.go:328] unrecognized feature gate: Example
Apr 17 17:24:00.016220 ip-10-0-136-202 kubenswrapper[2570]: W0417 17:24:00.009755 2570 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 17 17:24:00.016220 ip-10-0-136-202 kubenswrapper[2570]: W0417 17:24:00.009758 2570 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 17 17:24:00.016220 ip-10-0-136-202 kubenswrapper[2570]: W0417 17:24:00.009760 2570 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 17 17:24:00.016220 ip-10-0-136-202 kubenswrapper[2570]: W0417 17:24:00.009763 2570 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 17 17:24:00.016220 ip-10-0-136-202 kubenswrapper[2570]: W0417 17:24:00.009767 2570 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 17 17:24:00.016220 ip-10-0-136-202 kubenswrapper[2570]: W0417 17:24:00.009769 2570 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 17 17:24:00.016220 ip-10-0-136-202 kubenswrapper[2570]: W0417 17:24:00.009772 2570 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 17 17:24:00.016220 ip-10-0-136-202 kubenswrapper[2570]: W0417 17:24:00.009774 2570 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 17 17:24:00.016220 ip-10-0-136-202 kubenswrapper[2570]: W0417 17:24:00.009777 2570 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 17 17:24:00.016220 ip-10-0-136-202 kubenswrapper[2570]: W0417 17:24:00.009779 2570 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 17 17:24:00.016220 ip-10-0-136-202 kubenswrapper[2570]: W0417 17:24:00.009782 2570 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 17 17:24:00.016220 ip-10-0-136-202 kubenswrapper[2570]: W0417 17:24:00.009784 2570 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 17 17:24:00.016727 ip-10-0-136-202 kubenswrapper[2570]: W0417 17:24:00.009788 2570 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 17 17:24:00.016727 ip-10-0-136-202 kubenswrapper[2570]: W0417 17:24:00.009791 2570 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 17 17:24:00.016727 ip-10-0-136-202 kubenswrapper[2570]: W0417 17:24:00.009793 2570 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 17 17:24:00.016727 ip-10-0-136-202 kubenswrapper[2570]: W0417 17:24:00.009796 2570 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 17 17:24:00.016727 ip-10-0-136-202 kubenswrapper[2570]: W0417 17:24:00.009799 2570 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 17 17:24:00.016727 ip-10-0-136-202 kubenswrapper[2570]: W0417 17:24:00.009801 2570 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 17 17:24:00.016727 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:24:00.010480 2570 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 17 17:24:00.020414 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:24:00.020308 2570 server.go:530] "Kubelet version" kubeletVersion="v1.33.9"
Apr 17 17:24:00.020414 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:24:00.020413 2570 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
Apr 17 17:24:00.020535 ip-10-0-136-202 kubenswrapper[2570]: W0417 17:24:00.020459 2570 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 17 17:24:00.020535 ip-10-0-136-202 kubenswrapper[2570]: W0417 17:24:00.020465 2570 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 17 17:24:00.020535 ip-10-0-136-202 kubenswrapper[2570]: W0417 17:24:00.020469 2570 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 17 17:24:00.020535 ip-10-0-136-202 kubenswrapper[2570]: W0417 17:24:00.020472 2570 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 17 17:24:00.020535 ip-10-0-136-202 kubenswrapper[2570]: W0417 17:24:00.020475 2570 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 17 17:24:00.020535 ip-10-0-136-202 kubenswrapper[2570]: W0417 17:24:00.020478 2570 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 17 17:24:00.020535 ip-10-0-136-202 kubenswrapper[2570]: W0417 17:24:00.020482 2570 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 17 17:24:00.020535 ip-10-0-136-202 kubenswrapper[2570]: W0417 17:24:00.020485 2570 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 17 17:24:00.020535 ip-10-0-136-202 kubenswrapper[2570]: W0417 17:24:00.020488 2570 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 17 17:24:00.020535 ip-10-0-136-202 kubenswrapper[2570]: W0417 17:24:00.020490 2570 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 17 17:24:00.020535 ip-10-0-136-202 kubenswrapper[2570]: W0417 17:24:00.020493 2570 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 17 17:24:00.020535 ip-10-0-136-202 kubenswrapper[2570]: W0417 17:24:00.020496 2570 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 17 17:24:00.020535 ip-10-0-136-202 kubenswrapper[2570]: W0417 17:24:00.020499 2570 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 17 17:24:00.020535 ip-10-0-136-202 kubenswrapper[2570]: W0417 17:24:00.020502 2570 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 17 17:24:00.020535 ip-10-0-136-202 kubenswrapper[2570]: W0417 17:24:00.020505 2570 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 17 17:24:00.020535 ip-10-0-136-202 kubenswrapper[2570]: W0417 17:24:00.020508 2570 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 17 17:24:00.020535 ip-10-0-136-202 kubenswrapper[2570]: W0417 17:24:00.020510 2570 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 17 17:24:00.020535 ip-10-0-136-202 kubenswrapper[2570]: W0417 17:24:00.020513 2570 feature_gate.go:328] unrecognized feature gate: Example2
Apr 17 17:24:00.020535 ip-10-0-136-202 kubenswrapper[2570]: W0417 17:24:00.020516 2570 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 17 17:24:00.020535 ip-10-0-136-202 kubenswrapper[2570]: W0417 17:24:00.020519 2570 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 17 17:24:00.021032 ip-10-0-136-202 kubenswrapper[2570]: W0417 17:24:00.020522 2570 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 17 17:24:00.021032 ip-10-0-136-202 kubenswrapper[2570]: W0417 17:24:00.020525 2570 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 17 17:24:00.021032 ip-10-0-136-202 kubenswrapper[2570]: W0417 17:24:00.020527 2570 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 17 17:24:00.021032 ip-10-0-136-202 kubenswrapper[2570]: W0417 17:24:00.020530 2570 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 17 17:24:00.021032 ip-10-0-136-202 kubenswrapper[2570]: W0417 17:24:00.020533 2570 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 17 17:24:00.021032 ip-10-0-136-202 kubenswrapper[2570]: W0417 17:24:00.020535 2570 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 17 17:24:00.021032 ip-10-0-136-202 kubenswrapper[2570]: W0417 17:24:00.020538 2570 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 17 17:24:00.021032 ip-10-0-136-202 kubenswrapper[2570]: W0417 17:24:00.020541 2570 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 17 17:24:00.021032 ip-10-0-136-202 kubenswrapper[2570]: W0417 17:24:00.020544 2570 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 17 17:24:00.021032 ip-10-0-136-202 kubenswrapper[2570]: W0417 17:24:00.020547 2570 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 17 17:24:00.021032 ip-10-0-136-202 kubenswrapper[2570]: W0417 17:24:00.020550 2570 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 17 17:24:00.021032 ip-10-0-136-202 kubenswrapper[2570]: W0417 17:24:00.020553 2570 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 17 17:24:00.021032 ip-10-0-136-202 kubenswrapper[2570]: W0417 17:24:00.020555 2570 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 17 17:24:00.021032 ip-10-0-136-202 kubenswrapper[2570]: W0417 17:24:00.020558 2570 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 17 17:24:00.021032 ip-10-0-136-202 kubenswrapper[2570]: W0417 17:24:00.020561 2570 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 17 17:24:00.021032 ip-10-0-136-202 kubenswrapper[2570]: W0417 17:24:00.020564 2570 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 17 17:24:00.021032 ip-10-0-136-202 kubenswrapper[2570]: W0417 17:24:00.020566 2570 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 17 17:24:00.021032 ip-10-0-136-202 kubenswrapper[2570]: W0417 17:24:00.020570 2570 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Apr 17 17:24:00.021032 ip-10-0-136-202 kubenswrapper[2570]: W0417 17:24:00.020574 2570 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 17 17:24:00.021532 ip-10-0-136-202 kubenswrapper[2570]: W0417 17:24:00.020577 2570 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 17 17:24:00.021532 ip-10-0-136-202 kubenswrapper[2570]: W0417 17:24:00.020580 2570 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 17 17:24:00.021532 ip-10-0-136-202 kubenswrapper[2570]: W0417 17:24:00.020583 2570 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 17 17:24:00.021532 ip-10-0-136-202 kubenswrapper[2570]: W0417 17:24:00.020585 2570 feature_gate.go:328] unrecognized feature gate: Example Apr 17 17:24:00.021532 ip-10-0-136-202 kubenswrapper[2570]: W0417 17:24:00.020588 2570 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 17 17:24:00.021532 ip-10-0-136-202 kubenswrapper[2570]: W0417 17:24:00.020591 2570 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 17 17:24:00.021532 ip-10-0-136-202 kubenswrapper[2570]: W0417 17:24:00.020593 2570 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 17 17:24:00.021532 ip-10-0-136-202 kubenswrapper[2570]: W0417 17:24:00.020596 2570 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 17 17:24:00.021532 ip-10-0-136-202 kubenswrapper[2570]: W0417 17:24:00.020599 2570 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 17 17:24:00.021532 ip-10-0-136-202 kubenswrapper[2570]: W0417 17:24:00.020601 2570 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 17 17:24:00.021532 ip-10-0-136-202 kubenswrapper[2570]: W0417 17:24:00.020604 2570 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 17 17:24:00.021532 ip-10-0-136-202 kubenswrapper[2570]: W0417 17:24:00.020607 2570 feature_gate.go:328] unrecognized feature gate: 
BootImageSkewEnforcement Apr 17 17:24:00.021532 ip-10-0-136-202 kubenswrapper[2570]: W0417 17:24:00.020609 2570 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 17 17:24:00.021532 ip-10-0-136-202 kubenswrapper[2570]: W0417 17:24:00.020612 2570 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 17 17:24:00.021532 ip-10-0-136-202 kubenswrapper[2570]: W0417 17:24:00.020616 2570 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 17 17:24:00.021532 ip-10-0-136-202 kubenswrapper[2570]: W0417 17:24:00.020619 2570 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 17 17:24:00.021532 ip-10-0-136-202 kubenswrapper[2570]: W0417 17:24:00.020623 2570 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. Apr 17 17:24:00.021532 ip-10-0-136-202 kubenswrapper[2570]: W0417 17:24:00.020627 2570 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 17 17:24:00.021532 ip-10-0-136-202 kubenswrapper[2570]: W0417 17:24:00.020630 2570 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 17 17:24:00.021532 ip-10-0-136-202 kubenswrapper[2570]: W0417 17:24:00.020633 2570 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 17 17:24:00.022049 ip-10-0-136-202 kubenswrapper[2570]: W0417 17:24:00.020636 2570 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 17 17:24:00.022049 ip-10-0-136-202 kubenswrapper[2570]: W0417 17:24:00.020640 2570 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 17 17:24:00.022049 ip-10-0-136-202 kubenswrapper[2570]: W0417 17:24:00.020643 2570 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 17 17:24:00.022049 ip-10-0-136-202 kubenswrapper[2570]: W0417 17:24:00.020645 2570 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 17 17:24:00.022049 
ip-10-0-136-202 kubenswrapper[2570]: W0417 17:24:00.020648 2570 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 17 17:24:00.022049 ip-10-0-136-202 kubenswrapper[2570]: W0417 17:24:00.020651 2570 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 17 17:24:00.022049 ip-10-0-136-202 kubenswrapper[2570]: W0417 17:24:00.020654 2570 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 17 17:24:00.022049 ip-10-0-136-202 kubenswrapper[2570]: W0417 17:24:00.020657 2570 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 17 17:24:00.022049 ip-10-0-136-202 kubenswrapper[2570]: W0417 17:24:00.020660 2570 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 17 17:24:00.022049 ip-10-0-136-202 kubenswrapper[2570]: W0417 17:24:00.020662 2570 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 17 17:24:00.022049 ip-10-0-136-202 kubenswrapper[2570]: W0417 17:24:00.020665 2570 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 17 17:24:00.022049 ip-10-0-136-202 kubenswrapper[2570]: W0417 17:24:00.020668 2570 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 17 17:24:00.022049 ip-10-0-136-202 kubenswrapper[2570]: W0417 17:24:00.020670 2570 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 17 17:24:00.022049 ip-10-0-136-202 kubenswrapper[2570]: W0417 17:24:00.020672 2570 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 17 17:24:00.022049 ip-10-0-136-202 kubenswrapper[2570]: W0417 17:24:00.020675 2570 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 17 17:24:00.022049 ip-10-0-136-202 kubenswrapper[2570]: W0417 17:24:00.020678 2570 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 17 17:24:00.022049 ip-10-0-136-202 kubenswrapper[2570]: W0417 17:24:00.020680 2570 feature_gate.go:328] unrecognized feature gate: 
RouteAdvertisements Apr 17 17:24:00.022049 ip-10-0-136-202 kubenswrapper[2570]: W0417 17:24:00.020690 2570 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 17 17:24:00.022049 ip-10-0-136-202 kubenswrapper[2570]: W0417 17:24:00.020694 2570 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 17 17:24:00.022049 ip-10-0-136-202 kubenswrapper[2570]: W0417 17:24:00.020697 2570 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 17 17:24:00.022551 ip-10-0-136-202 kubenswrapper[2570]: W0417 17:24:00.020699 2570 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 17 17:24:00.022551 ip-10-0-136-202 kubenswrapper[2570]: W0417 17:24:00.020702 2570 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 17 17:24:00.022551 ip-10-0-136-202 kubenswrapper[2570]: W0417 17:24:00.020705 2570 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 17 17:24:00.022551 ip-10-0-136-202 kubenswrapper[2570]: W0417 17:24:00.020707 2570 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 17 17:24:00.022551 ip-10-0-136-202 kubenswrapper[2570]: W0417 17:24:00.020710 2570 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 17 17:24:00.022551 ip-10-0-136-202 kubenswrapper[2570]: W0417 17:24:00.020713 2570 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 17 17:24:00.022551 ip-10-0-136-202 kubenswrapper[2570]: W0417 17:24:00.020716 2570 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 17 17:24:00.022551 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:24:00.020722 2570 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true 
RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]} Apr 17 17:24:00.022551 ip-10-0-136-202 kubenswrapper[2570]: W0417 17:24:00.020812 2570 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 17 17:24:00.022551 ip-10-0-136-202 kubenswrapper[2570]: W0417 17:24:00.020817 2570 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 17 17:24:00.022551 ip-10-0-136-202 kubenswrapper[2570]: W0417 17:24:00.020820 2570 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 17 17:24:00.022551 ip-10-0-136-202 kubenswrapper[2570]: W0417 17:24:00.020822 2570 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 17 17:24:00.022551 ip-10-0-136-202 kubenswrapper[2570]: W0417 17:24:00.020825 2570 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 17 17:24:00.022551 ip-10-0-136-202 kubenswrapper[2570]: W0417 17:24:00.020828 2570 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 17 17:24:00.022551 ip-10-0-136-202 kubenswrapper[2570]: W0417 17:24:00.020831 2570 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 17 17:24:00.022924 ip-10-0-136-202 kubenswrapper[2570]: W0417 17:24:00.020833 2570 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 17 17:24:00.022924 ip-10-0-136-202 kubenswrapper[2570]: W0417 17:24:00.020836 2570 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 17 17:24:00.022924 ip-10-0-136-202 kubenswrapper[2570]: W0417 17:24:00.020839 2570 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 17 17:24:00.022924 ip-10-0-136-202 kubenswrapper[2570]: W0417 17:24:00.020841 2570 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 17 17:24:00.022924 
ip-10-0-136-202 kubenswrapper[2570]: W0417 17:24:00.020844 2570 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 17 17:24:00.022924 ip-10-0-136-202 kubenswrapper[2570]: W0417 17:24:00.020846 2570 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 17 17:24:00.022924 ip-10-0-136-202 kubenswrapper[2570]: W0417 17:24:00.020849 2570 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 17 17:24:00.022924 ip-10-0-136-202 kubenswrapper[2570]: W0417 17:24:00.020851 2570 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 17 17:24:00.022924 ip-10-0-136-202 kubenswrapper[2570]: W0417 17:24:00.020854 2570 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 17 17:24:00.022924 ip-10-0-136-202 kubenswrapper[2570]: W0417 17:24:00.020856 2570 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 17 17:24:00.022924 ip-10-0-136-202 kubenswrapper[2570]: W0417 17:24:00.020859 2570 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 17 17:24:00.022924 ip-10-0-136-202 kubenswrapper[2570]: W0417 17:24:00.020861 2570 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 17 17:24:00.022924 ip-10-0-136-202 kubenswrapper[2570]: W0417 17:24:00.020864 2570 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 17 17:24:00.022924 ip-10-0-136-202 kubenswrapper[2570]: W0417 17:24:00.020866 2570 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 17 17:24:00.022924 ip-10-0-136-202 kubenswrapper[2570]: W0417 17:24:00.020869 2570 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 17 17:24:00.022924 ip-10-0-136-202 kubenswrapper[2570]: W0417 17:24:00.020878 2570 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 17 17:24:00.022924 ip-10-0-136-202 kubenswrapper[2570]: W0417 17:24:00.020881 2570 feature_gate.go:328] unrecognized feature gate: 
AzureClusterHostedDNSInstall Apr 17 17:24:00.022924 ip-10-0-136-202 kubenswrapper[2570]: W0417 17:24:00.020883 2570 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 17 17:24:00.022924 ip-10-0-136-202 kubenswrapper[2570]: W0417 17:24:00.020886 2570 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 17 17:24:00.022924 ip-10-0-136-202 kubenswrapper[2570]: W0417 17:24:00.020888 2570 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 17 17:24:00.023433 ip-10-0-136-202 kubenswrapper[2570]: W0417 17:24:00.020891 2570 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 17 17:24:00.023433 ip-10-0-136-202 kubenswrapper[2570]: W0417 17:24:00.020893 2570 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 17 17:24:00.023433 ip-10-0-136-202 kubenswrapper[2570]: W0417 17:24:00.020896 2570 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 17 17:24:00.023433 ip-10-0-136-202 kubenswrapper[2570]: W0417 17:24:00.020899 2570 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 17 17:24:00.023433 ip-10-0-136-202 kubenswrapper[2570]: W0417 17:24:00.020902 2570 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 17 17:24:00.023433 ip-10-0-136-202 kubenswrapper[2570]: W0417 17:24:00.020905 2570 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 17 17:24:00.023433 ip-10-0-136-202 kubenswrapper[2570]: W0417 17:24:00.020907 2570 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 17 17:24:00.023433 ip-10-0-136-202 kubenswrapper[2570]: W0417 17:24:00.020910 2570 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 17 17:24:00.023433 ip-10-0-136-202 kubenswrapper[2570]: W0417 17:24:00.020913 2570 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 17 17:24:00.023433 ip-10-0-136-202 
kubenswrapper[2570]: W0417 17:24:00.020915 2570 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 17 17:24:00.023433 ip-10-0-136-202 kubenswrapper[2570]: W0417 17:24:00.020918 2570 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 17 17:24:00.023433 ip-10-0-136-202 kubenswrapper[2570]: W0417 17:24:00.020920 2570 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 17 17:24:00.023433 ip-10-0-136-202 kubenswrapper[2570]: W0417 17:24:00.020922 2570 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 17 17:24:00.023433 ip-10-0-136-202 kubenswrapper[2570]: W0417 17:24:00.020925 2570 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 17 17:24:00.023433 ip-10-0-136-202 kubenswrapper[2570]: W0417 17:24:00.020928 2570 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 17 17:24:00.023433 ip-10-0-136-202 kubenswrapper[2570]: W0417 17:24:00.020930 2570 feature_gate.go:328] unrecognized feature gate: Example2 Apr 17 17:24:00.023433 ip-10-0-136-202 kubenswrapper[2570]: W0417 17:24:00.020933 2570 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 17 17:24:00.023433 ip-10-0-136-202 kubenswrapper[2570]: W0417 17:24:00.020935 2570 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 17 17:24:00.023433 ip-10-0-136-202 kubenswrapper[2570]: W0417 17:24:00.020938 2570 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 17 17:24:00.023433 ip-10-0-136-202 kubenswrapper[2570]: W0417 17:24:00.020940 2570 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 17 17:24:00.023922 ip-10-0-136-202 kubenswrapper[2570]: W0417 17:24:00.020943 2570 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 17 17:24:00.023922 ip-10-0-136-202 kubenswrapper[2570]: W0417 17:24:00.020945 2570 feature_gate.go:328] unrecognized feature gate: 
PinnedImages Apr 17 17:24:00.023922 ip-10-0-136-202 kubenswrapper[2570]: W0417 17:24:00.020948 2570 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 17 17:24:00.023922 ip-10-0-136-202 kubenswrapper[2570]: W0417 17:24:00.020951 2570 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 17 17:24:00.023922 ip-10-0-136-202 kubenswrapper[2570]: W0417 17:24:00.020953 2570 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 17 17:24:00.023922 ip-10-0-136-202 kubenswrapper[2570]: W0417 17:24:00.020956 2570 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 17 17:24:00.023922 ip-10-0-136-202 kubenswrapper[2570]: W0417 17:24:00.020958 2570 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 17 17:24:00.023922 ip-10-0-136-202 kubenswrapper[2570]: W0417 17:24:00.020961 2570 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 17 17:24:00.023922 ip-10-0-136-202 kubenswrapper[2570]: W0417 17:24:00.020969 2570 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 17 17:24:00.023922 ip-10-0-136-202 kubenswrapper[2570]: W0417 17:24:00.020972 2570 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 17 17:24:00.023922 ip-10-0-136-202 kubenswrapper[2570]: W0417 17:24:00.020974 2570 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 17 17:24:00.023922 ip-10-0-136-202 kubenswrapper[2570]: W0417 17:24:00.020976 2570 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 17 17:24:00.023922 ip-10-0-136-202 kubenswrapper[2570]: W0417 17:24:00.020979 2570 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 17 17:24:00.023922 ip-10-0-136-202 kubenswrapper[2570]: W0417 17:24:00.020981 2570 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 17 17:24:00.023922 ip-10-0-136-202 
kubenswrapper[2570]: W0417 17:24:00.020984 2570 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 17 17:24:00.023922 ip-10-0-136-202 kubenswrapper[2570]: W0417 17:24:00.020987 2570 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 17 17:24:00.023922 ip-10-0-136-202 kubenswrapper[2570]: W0417 17:24:00.020990 2570 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 17 17:24:00.023922 ip-10-0-136-202 kubenswrapper[2570]: W0417 17:24:00.020993 2570 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 17 17:24:00.023922 ip-10-0-136-202 kubenswrapper[2570]: W0417 17:24:00.020995 2570 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 17 17:24:00.024419 ip-10-0-136-202 kubenswrapper[2570]: W0417 17:24:00.020999 2570 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. Apr 17 17:24:00.024419 ip-10-0-136-202 kubenswrapper[2570]: W0417 17:24:00.021003 2570 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 17 17:24:00.024419 ip-10-0-136-202 kubenswrapper[2570]: W0417 17:24:00.021006 2570 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 17 17:24:00.024419 ip-10-0-136-202 kubenswrapper[2570]: W0417 17:24:00.021009 2570 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 17 17:24:00.024419 ip-10-0-136-202 kubenswrapper[2570]: W0417 17:24:00.021012 2570 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 17 17:24:00.024419 ip-10-0-136-202 kubenswrapper[2570]: W0417 17:24:00.021014 2570 feature_gate.go:328] unrecognized feature gate: Example Apr 17 17:24:00.024419 ip-10-0-136-202 kubenswrapper[2570]: W0417 17:24:00.021017 2570 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 17 17:24:00.024419 ip-10-0-136-202 kubenswrapper[2570]: W0417 17:24:00.021021 2570 
feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Apr 17 17:24:00.024419 ip-10-0-136-202 kubenswrapper[2570]: W0417 17:24:00.021025 2570 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 17 17:24:00.024419 ip-10-0-136-202 kubenswrapper[2570]: W0417 17:24:00.021027 2570 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 17 17:24:00.024419 ip-10-0-136-202 kubenswrapper[2570]: W0417 17:24:00.021030 2570 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 17 17:24:00.024419 ip-10-0-136-202 kubenswrapper[2570]: W0417 17:24:00.021032 2570 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 17 17:24:00.024419 ip-10-0-136-202 kubenswrapper[2570]: W0417 17:24:00.021035 2570 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 17 17:24:00.024419 ip-10-0-136-202 kubenswrapper[2570]: W0417 17:24:00.021038 2570 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 17 17:24:00.024419 ip-10-0-136-202 kubenswrapper[2570]: W0417 17:24:00.021040 2570 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 17 17:24:00.024419 ip-10-0-136-202 kubenswrapper[2570]: W0417 17:24:00.021043 2570 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 17 17:24:00.024419 ip-10-0-136-202 kubenswrapper[2570]: W0417 17:24:00.021046 2570 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 17 17:24:00.024419 ip-10-0-136-202 kubenswrapper[2570]: W0417 17:24:00.021048 2570 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 17 17:24:00.024419 ip-10-0-136-202 kubenswrapper[2570]: W0417 17:24:00.021051 2570 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 17 17:24:00.024908 ip-10-0-136-202 kubenswrapper[2570]: W0417 17:24:00.021053 2570 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 17 
17:24:00.024908 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:24:00.021059 2570 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]} Apr 17 17:24:00.024908 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:24:00.021801 2570 server.go:962] "Client rotation is on, will bootstrap in background" Apr 17 17:24:00.024908 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:24:00.024452 2570 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir" Apr 17 17:24:00.025433 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:24:00.025421 2570 server.go:1019] "Starting client certificate rotation" Apr 17 17:24:00.025545 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:24:00.025531 2570 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kube-apiserver-client-kubelet" Apr 17 17:24:00.025581 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:24:00.025569 2570 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kube-apiserver-client-kubelet" Apr 17 17:24:00.053739 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:24:00.053719 2570 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Apr 17 17:24:00.058476 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:24:00.058454 2570 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Apr 17 17:24:00.070691 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:24:00.070677 2570 log.go:25] 
"Validated CRI v1 runtime API" Apr 17 17:24:00.077206 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:24:00.077191 2570 log.go:25] "Validated CRI v1 image API" Apr 17 17:24:00.078834 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:24:00.078816 2570 server.go:1452] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd" Apr 17 17:24:00.083606 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:24:00.083581 2570 fs.go:135] Filesystem UUIDs: map[22a0dd81-cfee-4034-ba32-e173d3d661d9:/dev/nvme0n1p3 7B77-95E7:/dev/nvme0n1p2 da164c3f-3c7e-471d-96f6-cce99661565a:/dev/nvme0n1p4] Apr 17 17:24:00.083606 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:24:00.083601 2570 fs.go:136] Filesystem partitions: map[/dev/nvme0n1p3:{mountpoint:/boot major:259 minor:3 fsType:ext4 blockSize:0} /dev/nvme0n1p4:{mountpoint:/var major:259 minor:4 fsType:xfs blockSize:0} /dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:31 fsType:tmpfs blockSize:0} composefs_0-33:{mountpoint:/ major:0 minor:33 fsType:overlay blockSize:0}] Apr 17 17:24:00.085198 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:24:00.085181 2570 reflector.go:430] "Caches populated" logger="kubernetes.io/kube-apiserver-client-kubelet" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162" Apr 17 17:24:00.089132 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:24:00.089026 2570 manager.go:217] Machine: {Timestamp:2026-04-17 17:24:00.087826447 +0000 UTC m=+0.467170251 CPUVendorID:GenuineIntel NumCores:8 NumPhysicalCores:4 NumSockets:1 CpuFrequency:3097077 MemoryCapacity:33164500992 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:ec26bbd6d30bbcfbd45024498b57a453 SystemUUID:ec26bbd6-d30b-bcfb-d450-24498b57a453 
BootID:a4edaa35-8e55-4743-a6f2-0e0d02c094f5 Filesystems:[{Device:/dev/nvme0n1p4 DeviceMajor:259 DeviceMinor:4 Capacity:128243970048 Type:vfs Inodes:62651840 HasInodes:true} {Device:composefs_0-33 DeviceMajor:0 DeviceMinor:33 Capacity:6103040 Type:vfs Inodes:18446744073709551615 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:31 Capacity:16582250496 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/nvme0n1p3 DeviceMajor:259 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16582250496 Type:vfs Inodes:4048401 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6632902656 Type:vfs Inodes:819200 HasInodes:true}] DiskMap:map[259:0:{Name:nvme0n1 Major:259 Minor:0 Size:128849018880 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:02:94:4e:b8:e0:0f Speed:0 Mtu:9001} {Name:ens5 MacAddress:02:94:4e:b8:e0:0f Speed:0 Mtu:9001} {Name:ovs-system MacAddress:02:7b:8d:28:f9:51 Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33164500992 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0 4] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:1 Threads:[1 5] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:2 Threads:[2 6] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:3 Threads:[3 7] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:}] Caches:[{Id:0 Size:37486592 Type:Unified Level:3}] Distances:[10]}] CloudProvider:Unknown 
InstanceType:Unknown InstanceID:None}
Apr 17 17:24:00.089132 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:24:00.089126 2570 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available.
Apr 17 17:24:00.089250 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:24:00.089233 2570 manager.go:233] Version: {KernelVersion:5.14.0-570.107.1.el9_6.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 9.6.20260414-0 (Plow) DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:}
Apr 17 17:24:00.089947 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:24:00.089913 2570 container_manager_linux.go:270] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Apr 17 17:24:00.090340 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:24:00.089945 2570 container_manager_linux.go:275] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ip-10-0-136-202.ec2.internal","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"500m","ephemeral-storage":"1Gi","memory":"1Gi"},"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2}
Apr 17 17:24:00.090439 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:24:00.090355 2570 topology_manager.go:138] "Creating topology manager with none policy"
Apr 17 17:24:00.090439 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:24:00.090369 2570 container_manager_linux.go:306] "Creating device plugin manager"
Apr 17 17:24:00.090439 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:24:00.090388 2570 manager.go:141] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock"
Apr 17 17:24:00.090439 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:24:00.090425 2570 server.go:72] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock"
Apr 17 17:24:00.091991 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:24:00.091976 2570 state_mem.go:36] "Initialized new in-memory state store"
Apr 17 17:24:00.092127 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:24:00.092117 2570 server.go:1267] "Using root directory" path="/var/lib/kubelet"
Apr 17 17:24:00.094897 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:24:00.094887 2570 kubelet.go:491] "Attempting to sync node with API server"
Apr 17 17:24:00.094951 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:24:00.094900 2570 kubelet.go:386] "Adding static pod path" path="/etc/kubernetes/manifests"
Apr 17 17:24:00.095626 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:24:00.095618 2570 file.go:69] "Watching path" path="/etc/kubernetes/manifests"
Apr 17 17:24:00.095664 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:24:00.095630 2570 kubelet.go:397] "Adding apiserver pod source"
Apr 17 17:24:00.095664 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:24:00.095638 2570 apiserver.go:42] "Waiting for node sync before watching apiserver pods"
Apr 17 17:24:00.096639 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:24:00.096628 2570 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking"
Apr 17 17:24:00.096686 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:24:00.096646 2570 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking"
Apr 17 17:24:00.099978 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:24:00.099961 2570 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="cri-o" version="1.33.10-2.rhaos4.20.gita4d0894.el9" apiVersion="v1"
Apr 17 17:24:00.100613 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:24:00.100598 2570 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-7q7fn"
Apr 17 17:24:00.101443 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:24:00.101430 2570 kubelet.go:953] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled"
Apr 17 17:24:00.103207 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:24:00.103195 2570 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume"
Apr 17 17:24:00.103254 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:24:00.103213 2570 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir"
Apr 17 17:24:00.103254 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:24:00.103219 2570 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/git-repo"
Apr 17 17:24:00.103254 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:24:00.103225 2570 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/host-path"
Apr 17 17:24:00.103254 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:24:00.103231 2570 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/nfs"
Apr 17 17:24:00.103254 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:24:00.103237 2570 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/secret"
Apr 17 17:24:00.103254 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:24:00.103243 2570 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/iscsi"
Apr 17 17:24:00.103254 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:24:00.103248 2570 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/downward-api"
Apr 17 17:24:00.103254 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:24:00.103254 2570 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/fc"
Apr 17 17:24:00.103457 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:24:00.103262 2570 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/configmap"
Apr 17 17:24:00.103457 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:24:00.103270 2570 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/projected"
Apr 17 17:24:00.103457 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:24:00.103279 2570 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/local-volume"
Apr 17 17:24:00.105737 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:24:00.105724 2570 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/csi"
Apr 17 17:24:00.105786 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:24:00.105740 2570 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/image"
Apr 17 17:24:00.105878 ip-10-0-136-202 kubenswrapper[2570]: E0417 17:24:00.105861 2570 reflector.go:200] "Failed to watch" err="failed to list *v1.Node: nodes \"ip-10-0-136-202.ec2.internal\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node"
Apr 17 17:24:00.106006 ip-10-0-136-202 kubenswrapper[2570]: E0417 17:24:00.105987 2570 reflector.go:200] "Failed to watch" err="failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service"
Apr 17 17:24:00.108204 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:24:00.108188 2570 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "ip-10-0-136-202.ec2.internal" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Apr 17 17:24:00.108890 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:24:00.108877 2570 watchdog_linux.go:99] "Systemd watchdog is not enabled"
Apr 17 17:24:00.108931 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:24:00.108913 2570 server.go:1295] "Started kubelet"
Apr 17 17:24:00.109641 ip-10-0-136-202 systemd[1]: Started Kubernetes Kubelet.
Apr 17 17:24:00.110226 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:24:00.109694 2570 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10
Apr 17 17:24:00.110279 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:24:00.110248 2570 server_v1.go:47] "podresources" method="list" useActivePods=true
Apr 17 17:24:00.110739 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:24:00.110721 2570 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-7q7fn"
Apr 17 17:24:00.110821 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:24:00.110716 2570 server.go:180] "Starting to listen" address="0.0.0.0" port=10250
Apr 17 17:24:00.113021 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:24:00.113004 2570 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock"
Apr 17 17:24:00.113091 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:24:00.113012 2570 server.go:317] "Adding debug handlers to kubelet server"
Apr 17 17:24:00.118176 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:24:00.118158 2570 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kubelet-serving"
Apr 17 17:24:00.119658 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:24:00.119520 2570 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer"
Apr 17 17:24:00.119658 ip-10-0-136-202 kubenswrapper[2570]: E0417 17:24:00.119576 2570 kubelet.go:1618] "Image garbage collection failed once. Stats initialization may not have completed yet" err="failed to get imageFs info: unable to find data in memory cache"
Apr 17 17:24:00.120275 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:24:00.120249 2570 volume_manager.go:295] "The desired_state_of_world populator starts"
Apr 17 17:24:00.120343 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:24:00.120279 2570 volume_manager.go:297] "Starting Kubelet Volume Manager"
Apr 17 17:24:00.120422 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:24:00.120365 2570 desired_state_of_world_populator.go:150] "Desired state populator starts to run"
Apr 17 17:24:00.120422 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:24:00.120389 2570 reconstruct.go:97] "Volume reconstruction finished"
Apr 17 17:24:00.120422 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:24:00.120399 2570 reconciler.go:26] "Reconciler: start to sync state"
Apr 17 17:24:00.120568 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:24:00.120499 2570 factory.go:55] Registering systemd factory
Apr 17 17:24:00.120568 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:24:00.120514 2570 factory.go:223] Registration of the systemd container factory successfully
Apr 17 17:24:00.120684 ip-10-0-136-202 kubenswrapper[2570]: E0417 17:24:00.120666 2570 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-136-202.ec2.internal\" not found"
Apr 17 17:24:00.120799 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:24:00.120780 2570 factory.go:153] Registering CRI-O factory
Apr 17 17:24:00.120799 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:24:00.120798 2570 factory.go:223] Registration of the crio container factory successfully
Apr 17 17:24:00.120898 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:24:00.120852 2570 factory.go:221] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory
Apr 17 17:24:00.120898 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:24:00.120872 2570 factory.go:103] Registering Raw factory
Apr 17 17:24:00.120898 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:24:00.120885 2570 manager.go:1196] Started watching for new ooms in manager
Apr 17 17:24:00.121346 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:24:00.121323 2570 manager.go:319] Starting recovery of all containers
Apr 17 17:24:00.122511 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:24:00.122493 2570 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 17 17:24:00.125667 ip-10-0-136-202 kubenswrapper[2570]: E0417 17:24:00.125645 2570 nodelease.go:49] "Failed to get node when trying to set owner ref to the node lease" err="nodes \"ip-10-0-136-202.ec2.internal\" not found" node="ip-10-0-136-202.ec2.internal"
Apr 17 17:24:00.134004 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:24:00.133988 2570 manager.go:324] Recovery completed
Apr 17 17:24:00.137752 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:24:00.137740 2570 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 17 17:24:00.140747 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:24:00.140730 2570 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-136-202.ec2.internal" event="NodeHasSufficientMemory"
Apr 17 17:24:00.140819 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:24:00.140755 2570 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-136-202.ec2.internal" event="NodeHasNoDiskPressure"
Apr 17 17:24:00.140819 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:24:00.140765 2570 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-136-202.ec2.internal" event="NodeHasSufficientPID"
Apr 17 17:24:00.141225 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:24:00.141213 2570 cpu_manager.go:222] "Starting CPU manager" policy="none"
Apr 17 17:24:00.141225 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:24:00.141223 2570 cpu_manager.go:223] "Reconciling" reconcilePeriod="10s"
Apr 17 17:24:00.141303 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:24:00.141237 2570 state_mem.go:36] "Initialized new in-memory state store"
Apr 17 17:24:00.143221 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:24:00.143210 2570 policy_none.go:49] "None policy: Start"
Apr 17 17:24:00.143266 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:24:00.143225 2570 memory_manager.go:186] "Starting memorymanager" policy="None"
Apr 17 17:24:00.143266 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:24:00.143243 2570 state_mem.go:35] "Initializing new in-memory state store"
Apr 17 17:24:00.178677 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:24:00.178664 2570 manager.go:341] "Starting Device Plugin manager"
Apr 17 17:24:00.191213 ip-10-0-136-202 kubenswrapper[2570]: E0417 17:24:00.178710 2570 manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint"
Apr 17 17:24:00.191213 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:24:00.178722 2570 server.go:85] "Starting device plugin registration server"
Apr 17 17:24:00.191213 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:24:00.178906 2570 eviction_manager.go:189] "Eviction manager: starting control loop"
Apr 17 17:24:00.191213 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:24:00.178916 2570 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s"
Apr 17 17:24:00.191213 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:24:00.178982 2570 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry"
Apr 17 17:24:00.191213 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:24:00.179053 2570 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts"
Apr 17 17:24:00.191213 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:24:00.179061 2570 plugin_manager.go:118] "Starting Kubelet Plugin Manager"
Apr 17 17:24:00.191213 ip-10-0-136-202 kubenswrapper[2570]: E0417 17:24:00.179486 2570 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="non-existent label \"crio-containers\""
Apr 17 17:24:00.191213 ip-10-0-136-202 kubenswrapper[2570]: E0417 17:24:00.179518 2570 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ip-10-0-136-202.ec2.internal\" not found"
Apr 17 17:24:00.247683 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:24:00.247661 2570 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv4"
Apr 17 17:24:00.248864 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:24:00.248847 2570 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv6"
Apr 17 17:24:00.248926 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:24:00.248882 2570 status_manager.go:230] "Starting to sync pod status with apiserver"
Apr 17 17:24:00.248926 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:24:00.248900 2570 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started."
Apr 17 17:24:00.248926 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:24:00.248908 2570 kubelet.go:2451] "Starting kubelet main sync loop"
Apr 17 17:24:00.249059 ip-10-0-136-202 kubenswrapper[2570]: E0417 17:24:00.248972 2570 kubelet.go:2475] "Skipping pod synchronization" err="PLEG is not healthy: pleg has yet to be successful"
Apr 17 17:24:00.251293 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:24:00.251249 2570 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 17 17:24:00.279368 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:24:00.279350 2570 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 17 17:24:00.280065 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:24:00.280046 2570 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-136-202.ec2.internal" event="NodeHasSufficientMemory"
Apr 17 17:24:00.280132 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:24:00.280077 2570 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-136-202.ec2.internal" event="NodeHasNoDiskPressure"
Apr 17 17:24:00.280132 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:24:00.280092 2570 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-136-202.ec2.internal" event="NodeHasSufficientPID"
Apr 17 17:24:00.280132 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:24:00.280111 2570 kubelet_node_status.go:78] "Attempting to register node" node="ip-10-0-136-202.ec2.internal"
Apr 17 17:24:00.289482 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:24:00.289468 2570 kubelet_node_status.go:81] "Successfully registered node" node="ip-10-0-136-202.ec2.internal"
Apr 17 17:24:00.289534 ip-10-0-136-202 kubenswrapper[2570]: E0417 17:24:00.289488 2570 kubelet_node_status.go:597] "Error updating node status, will retry" err="error getting node \"ip-10-0-136-202.ec2.internal\": node \"ip-10-0-136-202.ec2.internal\" not found"
Apr 17 17:24:00.302664 ip-10-0-136-202 kubenswrapper[2570]: E0417 17:24:00.302646 2570 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-136-202.ec2.internal\" not found"
Apr 17 17:24:00.349476 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:24:00.349448 2570 kubelet.go:2537] "SyncLoop ADD" source="file" pods=["kube-system/kube-apiserver-proxy-ip-10-0-136-202.ec2.internal","openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-136-202.ec2.internal"]
Apr 17 17:24:00.349540 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:24:00.349517 2570 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 17 17:24:00.350360 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:24:00.350346 2570 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-136-202.ec2.internal" event="NodeHasSufficientMemory"
Apr 17 17:24:00.350423 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:24:00.350371 2570 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-136-202.ec2.internal" event="NodeHasNoDiskPressure"
Apr 17 17:24:00.350423 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:24:00.350381 2570 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-136-202.ec2.internal" event="NodeHasSufficientPID"
Apr 17 17:24:00.351553 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:24:00.351541 2570 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 17 17:24:00.351671 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:24:00.351656 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-136-202.ec2.internal"
Apr 17 17:24:00.351721 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:24:00.351680 2570 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 17 17:24:00.352206 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:24:00.352190 2570 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-136-202.ec2.internal" event="NodeHasSufficientMemory"
Apr 17 17:24:00.352274 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:24:00.352220 2570 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-136-202.ec2.internal" event="NodeHasNoDiskPressure"
Apr 17 17:24:00.352274 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:24:00.352230 2570 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-136-202.ec2.internal" event="NodeHasSufficientPID"
Apr 17 17:24:00.352345 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:24:00.352192 2570 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-136-202.ec2.internal" event="NodeHasSufficientMemory"
Apr 17 17:24:00.352345 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:24:00.352295 2570 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-136-202.ec2.internal" event="NodeHasNoDiskPressure"
Apr 17 17:24:00.352345 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:24:00.352308 2570 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-136-202.ec2.internal" event="NodeHasSufficientPID"
Apr 17 17:24:00.353245 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:24:00.353232 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-136-202.ec2.internal"
Apr 17 17:24:00.353294 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:24:00.353255 2570 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 17 17:24:00.353838 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:24:00.353815 2570 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-136-202.ec2.internal" event="NodeHasSufficientMemory"
Apr 17 17:24:00.353920 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:24:00.353842 2570 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-136-202.ec2.internal" event="NodeHasNoDiskPressure"
Apr 17 17:24:00.353920 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:24:00.353856 2570 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-136-202.ec2.internal" event="NodeHasSufficientPID"
Apr 17 17:24:00.380785 ip-10-0-136-202 kubenswrapper[2570]: E0417 17:24:00.380762 2570 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-136-202.ec2.internal\" not found" node="ip-10-0-136-202.ec2.internal"
Apr 17 17:24:00.384936 ip-10-0-136-202 kubenswrapper[2570]: E0417 17:24:00.384921 2570 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-136-202.ec2.internal\" not found" node="ip-10-0-136-202.ec2.internal"
Apr 17 17:24:00.403670 ip-10-0-136-202 kubenswrapper[2570]: E0417 17:24:00.403651 2570 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-136-202.ec2.internal\" not found"
Apr 17 17:24:00.421741 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:24:00.421725 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/697494af72639fc0eb382c71525a5808-config\") pod \"kube-apiserver-proxy-ip-10-0-136-202.ec2.internal\" (UID: \"697494af72639fc0eb382c71525a5808\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-136-202.ec2.internal"
Apr 17 17:24:00.421811 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:24:00.421755 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/5edfcc530d2d3e0bfbb4e13a5b40905e-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-136-202.ec2.internal\" (UID: \"5edfcc530d2d3e0bfbb4e13a5b40905e\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-136-202.ec2.internal"
Apr 17 17:24:00.421811 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:24:00.421771 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/5edfcc530d2d3e0bfbb4e13a5b40905e-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-136-202.ec2.internal\" (UID: \"5edfcc530d2d3e0bfbb4e13a5b40905e\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-136-202.ec2.internal"
Apr 17 17:24:00.504581 ip-10-0-136-202 kubenswrapper[2570]: E0417 17:24:00.504537 2570 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-136-202.ec2.internal\" not found"
Apr 17 17:24:00.522209 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:24:00.522192 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/697494af72639fc0eb382c71525a5808-config\") pod \"kube-apiserver-proxy-ip-10-0-136-202.ec2.internal\" (UID: \"697494af72639fc0eb382c71525a5808\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-136-202.ec2.internal"
Apr 17 17:24:00.522290 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:24:00.522219 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/5edfcc530d2d3e0bfbb4e13a5b40905e-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-136-202.ec2.internal\" (UID: \"5edfcc530d2d3e0bfbb4e13a5b40905e\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-136-202.ec2.internal"
Apr 17 17:24:00.522290 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:24:00.522254 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/5edfcc530d2d3e0bfbb4e13a5b40905e-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-136-202.ec2.internal\" (UID: \"5edfcc530d2d3e0bfbb4e13a5b40905e\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-136-202.ec2.internal"
Apr 17 17:24:00.522391 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:24:00.522299 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/5edfcc530d2d3e0bfbb4e13a5b40905e-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-136-202.ec2.internal\" (UID: \"5edfcc530d2d3e0bfbb4e13a5b40905e\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-136-202.ec2.internal"
Apr 17 17:24:00.522391 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:24:00.522306 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/host-path/697494af72639fc0eb382c71525a5808-config\") pod \"kube-apiserver-proxy-ip-10-0-136-202.ec2.internal\" (UID: \"697494af72639fc0eb382c71525a5808\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-136-202.ec2.internal"
Apr 17 17:24:00.522391 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:24:00.522370 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/5edfcc530d2d3e0bfbb4e13a5b40905e-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-136-202.ec2.internal\" (UID: \"5edfcc530d2d3e0bfbb4e13a5b40905e\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-136-202.ec2.internal"
Apr 17 17:24:00.605592 ip-10-0-136-202 kubenswrapper[2570]: E0417 17:24:00.605555 2570 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-136-202.ec2.internal\" not found"
Apr 17 17:24:00.683015 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:24:00.682998 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-136-202.ec2.internal"
Apr 17 17:24:00.687523 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:24:00.687506 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-136-202.ec2.internal"
Apr 17 17:24:00.706261 ip-10-0-136-202 kubenswrapper[2570]: E0417 17:24:00.706234 2570 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-136-202.ec2.internal\" not found"
Apr 17 17:24:00.806774 ip-10-0-136-202 kubenswrapper[2570]: E0417 17:24:00.806723 2570 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-136-202.ec2.internal\" not found"
Apr 17 17:24:00.907197 ip-10-0-136-202 kubenswrapper[2570]: E0417 17:24:00.907174 2570 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-136-202.ec2.internal\" not found"
Apr 17 17:24:00.963178 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:24:00.963136 2570 reflector.go:430] "Caches populated" type="*v1.Node" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 17 17:24:01.019839 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:24:01.019805 2570 kubelet.go:3340] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-136-202.ec2.internal"
Apr 17 17:24:01.026118 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:24:01.026103 2570 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials"
Apr 17 17:24:01.026240 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:24:01.026223 2570 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received"
Apr 17 17:24:01.026298 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:24:01.026271 2570 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received"
Apr 17 17:24:01.026298 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:24:01.026271 2570 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received"
Apr 17 17:24:01.026388 ip-10-0-136-202 kubenswrapper[2570]: E0417 17:24:01.026314 2570 kubelet.go:3342] "Failed creating a mirror pod" err="Post \"https://a9dfb5aa8b82f4eabbf5a28ef7c0977d-5ca589ec8d662dce.elb.us-east-1.amazonaws.com:6443/api/v1/namespaces/kube-system/pods\": read tcp 10.0.136.202:56966->23.22.109.228:6443: use of closed network connection" pod="kube-system/kube-apiserver-proxy-ip-10-0-136-202.ec2.internal"
Apr 17 17:24:01.026388 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:24:01.026351 2570 kubelet.go:3340] "Creating a mirror pod for static pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-136-202.ec2.internal"
Apr 17 17:24:01.043635 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:24:01.043611 2570 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]"
Apr 17 17:24:01.096170 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:24:01.096150 2570 apiserver.go:52] "Watching apiserver"
Apr 17 17:24:01.102982 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:24:01.102965 2570 reflector.go:430] "Caches populated" type="*v1.Pod" reflector="pkg/kubelet/config/apiserver.go:66"
Apr 17 17:24:01.105115 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:24:01.105087 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/node-resolver-b6njn","openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-136-202.ec2.internal","openshift-multus/multus-additional-cni-plugins-j4mx4","openshift-multus/multus-xtx74","openshift-ovn-kubernetes/ovnkube-node-42tv5","kube-system/konnectivity-agent-n6fpq","kube-system/kube-apiserver-proxy-ip-10-0-136-202.ec2.internal","openshift-image-registry/node-ca-ks8m2","openshift-multus/network-metrics-daemon-zcn2s","openshift-network-diagnostics/network-check-target-xbzwj","openshift-network-operator/iptables-alerter-gbft6","openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-hjzlb","openshift-cluster-node-tuning-operator/tuned-426x5"]
Apr 17 17:24:01.106493 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:24:01.106466 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xbzwj"
Apr 17 17:24:01.106579 ip-10-0-136-202 kubenswrapper[2570]: E0417 17:24:01.106557 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xbzwj" podUID="3b7788b5-7a25-42a0-8536-8a543283bb1e"
Apr 17 17:24:01.107924 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:24:01.107909 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-j4mx4"
Apr 17 17:24:01.111245 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:24:01.108787 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-xtx74"
Apr 17 17:24:01.111245 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:24:01.110278 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"default-cni-sysctl-allowlist\""
Apr 17 17:24:01.111245 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:24:01.110433 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"whereabouts-flatfile-config\""
Apr 17 17:24:01.111245 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:24:01.110472 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"cni-copy-resources\""
Apr 17 17:24:01.111245 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:24:01.110521 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"openshift-service-ca.crt\""
Apr 17 17:24:01.111245 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:24:01.110600 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"kube-root-ca.crt\""
Apr 17 17:24:01.111608 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:24:01.111304 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"multus-ancillary-tools-dockercfg-vktcn\""
Apr 17 17:24:01.111608 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:24:01.111519 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-42tv5"
Apr 17 17:24:01.111708 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:24:01.111609 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"multus-daemon-config\""
Apr 17 17:24:01.112494 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:24:01.112475 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"default-dockercfg-45gx9\""
Apr 17 17:24:01.113157 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:24:01.113123 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-n6fpq"
Apr 17 17:24:01.113751 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:24:01.113735 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-node-metrics-cert\""
Apr 17 17:24:01.113823 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:24:01.113767 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"env-overrides\""
Apr 17 17:24:01.114043 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:24:01.114009 2570 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kube-apiserver-client-kubelet" expiration="2028-04-16 17:19:00 +0000 UTC" deadline="2027-09-25 04:30:50.429122943 +0000 UTC"
Apr 17 17:24:01.114104 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:24:01.114043 2570 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kube-apiserver-client-kubelet" sleep="12611h6m49.315083197s"
Apr 17 17:24:01.114177 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:24:01.114130 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"kube-root-ca.crt\""
Apr 17 17:24:01.114310 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:24:01.114297 2570 util.go:30] "No sandbox for pod 
can be found. Need to start a new one" pod="openshift-image-registry/node-ca-ks8m2" Apr 17 17:24:01.115072 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:24:01.115052 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"openshift-service-ca.crt\"" Apr 17 17:24:01.115190 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:24:01.115075 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-config\"" Apr 17 17:24:01.115190 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:24:01.115081 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-script-lib\"" Apr 17 17:24:01.115190 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:24:01.115091 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-kubernetes-node-dockercfg-pkz6h\"" Apr 17 17:24:01.115372 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:24:01.115220 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kube-system\"/\"konnectivity-ca-bundle\"" Apr 17 17:24:01.115461 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:24:01.115447 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-zcn2s" Apr 17 17:24:01.115523 ip-10-0-136-202 kubenswrapper[2570]: E0417 17:24:01.115507 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-zcn2s" podUID="82dffe03-2b0c-4ac4-bb02-5a8430704805" Apr 17 17:24:01.115729 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:24:01.115715 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"konnectivity-agent\"" Apr 17 17:24:01.115773 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:24:01.115717 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"default-dockercfg-jvc75\"" Apr 17 17:24:01.116520 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:24:01.116502 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"node-ca-dockercfg-pgjpq\"" Apr 17 17:24:01.116604 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:24:01.116507 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-b6njn" Apr 17 17:24:01.116679 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:24:01.116662 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"image-registry-certificates\"" Apr 17 17:24:01.116736 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:24:01.116668 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"openshift-service-ca.crt\"" Apr 17 17:24:01.116970 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:24:01.116957 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"kube-root-ca.crt\"" Apr 17 17:24:01.117623 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:24:01.117609 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-operator/iptables-alerter-gbft6" Apr 17 17:24:01.118498 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:24:01.118485 2570 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kubelet-serving" Apr 17 17:24:01.118755 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:24:01.118742 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-hjzlb" Apr 17 17:24:01.119782 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:24:01.119763 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-426x5" Apr 17 17:24:01.121184 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:24:01.121168 2570 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world" Apr 17 17:24:01.123007 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:24:01.122989 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"openshift-service-ca.crt\"" Apr 17 17:24:01.123105 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:24:01.123088 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"kube-root-ca.crt\"" Apr 17 17:24:01.123198 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:24:01.123033 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"openshift-service-ca.crt\"" Apr 17 17:24:01.123262 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:24:01.123063 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"node-resolver-dockercfg-7kqdp\"" Apr 17 17:24:01.123262 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:24:01.122998 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"kube-root-ca.crt\"" Apr 17 17:24:01.123539 ip-10-0-136-202 
kubenswrapper[2570]: I0417 17:24:01.123521 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"kube-root-ca.crt\"" Apr 17 17:24:01.123616 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:24:01.123547 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-dockercfg-cwrgc\"" Apr 17 17:24:01.123616 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:24:01.123520 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"openshift-service-ca.crt\"" Apr 17 17:24:01.123616 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:24:01.123606 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"kube-root-ca.crt\"" Apr 17 17:24:01.123778 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:24:01.123623 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"tuned-dockercfg-9zw8g\"" Apr 17 17:24:01.123888 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:24:01.123873 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-metrics-serving-cert\"" Apr 17 17:24:01.123969 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:24:01.123875 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-script\"" Apr 17 17:24:01.124023 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:24:01.123877 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"openshift-service-ca.crt\"" Apr 17 17:24:01.124073 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:24:01.123879 2570 reflector.go:430] "Caches populated" type="*v1.Secret" 
reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-sa-dockercfg-q47cq\"" Apr 17 17:24:01.125594 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:24:01.125572 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mdj26\" (UniqueName: \"kubernetes.io/projected/e75ff2ec-f311-467f-a5b6-86322293f3ed-kube-api-access-mdj26\") pod \"node-resolver-b6njn\" (UID: \"e75ff2ec-f311-467f-a5b6-86322293f3ed\") " pod="openshift-dns/node-resolver-b6njn" Apr 17 17:24:01.125673 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:24:01.125608 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/1f813146-daee-4a79-a436-af839213c97f-etc-sysconfig\") pod \"tuned-426x5\" (UID: \"1f813146-daee-4a79-a436-af839213c97f\") " pod="openshift-cluster-node-tuning-operator/tuned-426x5" Apr 17 17:24:01.125673 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:24:01.125632 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/1f813146-daee-4a79-a436-af839213c97f-etc-sysctl-d\") pod \"tuned-426x5\" (UID: \"1f813146-daee-4a79-a436-af839213c97f\") " pod="openshift-cluster-node-tuning-operator/tuned-426x5" Apr 17 17:24:01.125673 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:24:01.125653 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/1f813146-daee-4a79-a436-af839213c97f-etc-systemd\") pod \"tuned-426x5\" (UID: \"1f813146-daee-4a79-a436-af839213c97f\") " pod="openshift-cluster-node-tuning-operator/tuned-426x5" Apr 17 17:24:01.125792 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:24:01.125677 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: 
\"kubernetes.io/host-path/02eb784a-744c-4eac-bec8-46a3df516ca9-tuning-conf-dir\") pod \"multus-additional-cni-plugins-j4mx4\" (UID: \"02eb784a-744c-4eac-bec8-46a3df516ca9\") " pod="openshift-multus/multus-additional-cni-plugins-j4mx4" Apr 17 17:24:01.125792 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:24:01.125701 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/0a83ec74-42da-427f-be72-02e777b9626c-node-log\") pod \"ovnkube-node-42tv5\" (UID: \"0a83ec74-42da-427f-be72-02e777b9626c\") " pod="openshift-ovn-kubernetes/ovnkube-node-42tv5" Apr 17 17:24:01.125792 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:24:01.125722 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/9f3a808b-95e3-410b-bcf1-257bf1254f04-host-run-netns\") pod \"multus-xtx74\" (UID: \"9f3a808b-95e3-410b-bcf1-257bf1254f04\") " pod="openshift-multus/multus-xtx74" Apr 17 17:24:01.125792 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:24:01.125744 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/1f813146-daee-4a79-a436-af839213c97f-etc-modprobe-d\") pod \"tuned-426x5\" (UID: \"1f813146-daee-4a79-a436-af839213c97f\") " pod="openshift-cluster-node-tuning-operator/tuned-426x5" Apr 17 17:24:01.125792 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:24:01.125765 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/1f813146-daee-4a79-a436-af839213c97f-host\") pod \"tuned-426x5\" (UID: \"1f813146-daee-4a79-a436-af839213c97f\") " pod="openshift-cluster-node-tuning-operator/tuned-426x5" Apr 17 17:24:01.125960 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:24:01.125798 2570 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5m6t7\" (UniqueName: \"kubernetes.io/projected/61a26f4f-99b7-4f63-a4a2-b6fef61dff87-kube-api-access-5m6t7\") pod \"aws-ebs-csi-driver-node-hjzlb\" (UID: \"61a26f4f-99b7-4f63-a4a2-b6fef61dff87\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-hjzlb" Apr 17 17:24:01.125960 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:24:01.125819 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/e75ff2ec-f311-467f-a5b6-86322293f3ed-hosts-file\") pod \"node-resolver-b6njn\" (UID: \"e75ff2ec-f311-467f-a5b6-86322293f3ed\") " pod="openshift-dns/node-resolver-b6njn" Apr 17 17:24:01.125960 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:24:01.125840 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/1f813146-daee-4a79-a436-af839213c97f-sys\") pod \"tuned-426x5\" (UID: \"1f813146-daee-4a79-a436-af839213c97f\") " pod="openshift-cluster-node-tuning-operator/tuned-426x5" Apr 17 17:24:01.125960 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:24:01.125854 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/0a83ec74-42da-427f-be72-02e777b9626c-ovnkube-config\") pod \"ovnkube-node-42tv5\" (UID: \"0a83ec74-42da-427f-be72-02e777b9626c\") " pod="openshift-ovn-kubernetes/ovnkube-node-42tv5" Apr 17 17:24:01.125960 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:24:01.125867 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/9f3a808b-95e3-410b-bcf1-257bf1254f04-cni-binary-copy\") pod \"multus-xtx74\" (UID: \"9f3a808b-95e3-410b-bcf1-257bf1254f04\") " pod="openshift-multus/multus-xtx74" Apr 17 
17:24:01.125960 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:24:01.125880 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/0a83ec74-42da-427f-be72-02e777b9626c-ovnkube-script-lib\") pod \"ovnkube-node-42tv5\" (UID: \"0a83ec74-42da-427f-be72-02e777b9626c\") " pod="openshift-ovn-kubernetes/ovnkube-node-42tv5" Apr 17 17:24:01.125960 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:24:01.125900 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/9f3a808b-95e3-410b-bcf1-257bf1254f04-hostroot\") pod \"multus-xtx74\" (UID: \"9f3a808b-95e3-410b-bcf1-257bf1254f04\") " pod="openshift-multus/multus-xtx74" Apr 17 17:24:01.126196 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:24:01.125954 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6d8k9\" (UniqueName: \"kubernetes.io/projected/9f3a808b-95e3-410b-bcf1-257bf1254f04-kube-api-access-6d8k9\") pod \"multus-xtx74\" (UID: \"9f3a808b-95e3-410b-bcf1-257bf1254f04\") " pod="openshift-multus/multus-xtx74" Apr 17 17:24:01.126196 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:24:01.126000 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/1f813146-daee-4a79-a436-af839213c97f-etc-tuned\") pod \"tuned-426x5\" (UID: \"1f813146-daee-4a79-a436-af839213c97f\") " pod="openshift-cluster-node-tuning-operator/tuned-426x5" Apr 17 17:24:01.126196 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:24:01.126020 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/1f813146-daee-4a79-a436-af839213c97f-tmp\") pod \"tuned-426x5\" (UID: \"1f813146-daee-4a79-a436-af839213c97f\") " 
pod="openshift-cluster-node-tuning-operator/tuned-426x5" Apr 17 17:24:01.126196 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:24:01.126037 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/02eb784a-744c-4eac-bec8-46a3df516ca9-system-cni-dir\") pod \"multus-additional-cni-plugins-j4mx4\" (UID: \"02eb784a-744c-4eac-bec8-46a3df516ca9\") " pod="openshift-multus/multus-additional-cni-plugins-j4mx4" Apr 17 17:24:01.126196 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:24:01.126051 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/0a83ec74-42da-427f-be72-02e777b9626c-host-run-ovn-kubernetes\") pod \"ovnkube-node-42tv5\" (UID: \"0a83ec74-42da-427f-be72-02e777b9626c\") " pod="openshift-ovn-kubernetes/ovnkube-node-42tv5" Apr 17 17:24:01.126196 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:24:01.126066 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/8947a5af-e839-4e78-8aef-37f0885ea400-serviceca\") pod \"node-ca-ks8m2\" (UID: \"8947a5af-e839-4e78-8aef-37f0885ea400\") " pod="openshift-image-registry/node-ca-ks8m2" Apr 17 17:24:01.126196 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:24:01.126080 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-87n25\" (UniqueName: \"kubernetes.io/projected/82dffe03-2b0c-4ac4-bb02-5a8430704805-kube-api-access-87n25\") pod \"network-metrics-daemon-zcn2s\" (UID: \"82dffe03-2b0c-4ac4-bb02-5a8430704805\") " pod="openshift-multus/network-metrics-daemon-zcn2s" Apr 17 17:24:01.126196 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:24:01.126096 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" 
(UniqueName: \"kubernetes.io/host-path/9f3a808b-95e3-410b-bcf1-257bf1254f04-system-cni-dir\") pod \"multus-xtx74\" (UID: \"9f3a808b-95e3-410b-bcf1-257bf1254f04\") " pod="openshift-multus/multus-xtx74" Apr 17 17:24:01.126196 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:24:01.126111 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/9f3a808b-95e3-410b-bcf1-257bf1254f04-host-run-k8s-cni-cncf-io\") pod \"multus-xtx74\" (UID: \"9f3a808b-95e3-410b-bcf1-257bf1254f04\") " pod="openshift-multus/multus-xtx74" Apr 17 17:24:01.126196 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:24:01.126125 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mgfs9\" (UniqueName: \"kubernetes.io/projected/1f813146-daee-4a79-a436-af839213c97f-kube-api-access-mgfs9\") pod \"tuned-426x5\" (UID: \"1f813146-daee-4a79-a436-af839213c97f\") " pod="openshift-cluster-node-tuning-operator/tuned-426x5" Apr 17 17:24:01.126196 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:24:01.126138 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/0a83ec74-42da-427f-be72-02e777b9626c-host-slash\") pod \"ovnkube-node-42tv5\" (UID: \"0a83ec74-42da-427f-be72-02e777b9626c\") " pod="openshift-ovn-kubernetes/ovnkube-node-42tv5" Apr 17 17:24:01.126196 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:24:01.126171 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/7683382f-975a-43b6-9d75-7d14283c2328-host-slash\") pod \"iptables-alerter-gbft6\" (UID: \"7683382f-975a-43b6-9d75-7d14283c2328\") " pod="openshift-network-operator/iptables-alerter-gbft6" Apr 17 17:24:01.126196 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:24:01.126185 2570 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/61a26f4f-99b7-4f63-a4a2-b6fef61dff87-registration-dir\") pod \"aws-ebs-csi-driver-node-hjzlb\" (UID: \"61a26f4f-99b7-4f63-a4a2-b6fef61dff87\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-hjzlb" Apr 17 17:24:01.126196 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:24:01.126198 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zjf97\" (UniqueName: \"kubernetes.io/projected/3b7788b5-7a25-42a0-8536-8a543283bb1e-kube-api-access-zjf97\") pod \"network-check-target-xbzwj\" (UID: \"3b7788b5-7a25-42a0-8536-8a543283bb1e\") " pod="openshift-network-diagnostics/network-check-target-xbzwj" Apr 17 17:24:01.126730 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:24:01.126211 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/9f3a808b-95e3-410b-bcf1-257bf1254f04-etc-kubernetes\") pod \"multus-xtx74\" (UID: \"9f3a808b-95e3-410b-bcf1-257bf1254f04\") " pod="openshift-multus/multus-xtx74" Apr 17 17:24:01.126730 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:24:01.126226 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/02eb784a-744c-4eac-bec8-46a3df516ca9-os-release\") pod \"multus-additional-cni-plugins-j4mx4\" (UID: \"02eb784a-744c-4eac-bec8-46a3df516ca9\") " pod="openshift-multus/multus-additional-cni-plugins-j4mx4" Apr 17 17:24:01.126730 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:24:01.126261 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w69t5\" (UniqueName: \"kubernetes.io/projected/02eb784a-744c-4eac-bec8-46a3df516ca9-kube-api-access-w69t5\") pod 
\"multus-additional-cni-plugins-j4mx4\" (UID: \"02eb784a-744c-4eac-bec8-46a3df516ca9\") " pod="openshift-multus/multus-additional-cni-plugins-j4mx4" Apr 17 17:24:01.126730 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:24:01.126297 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/0a83ec74-42da-427f-be72-02e777b9626c-etc-openvswitch\") pod \"ovnkube-node-42tv5\" (UID: \"0a83ec74-42da-427f-be72-02e777b9626c\") " pod="openshift-ovn-kubernetes/ovnkube-node-42tv5" Apr 17 17:24:01.126730 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:24:01.126324 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/9f3a808b-95e3-410b-bcf1-257bf1254f04-cnibin\") pod \"multus-xtx74\" (UID: \"9f3a808b-95e3-410b-bcf1-257bf1254f04\") " pod="openshift-multus/multus-xtx74" Apr 17 17:24:01.126730 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:24:01.126343 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/9f3a808b-95e3-410b-bcf1-257bf1254f04-multus-conf-dir\") pod \"multus-xtx74\" (UID: \"9f3a808b-95e3-410b-bcf1-257bf1254f04\") " pod="openshift-multus/multus-xtx74" Apr 17 17:24:01.126730 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:24:01.126358 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/02eb784a-744c-4eac-bec8-46a3df516ca9-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-j4mx4\" (UID: \"02eb784a-744c-4eac-bec8-46a3df516ca9\") " pod="openshift-multus/multus-additional-cni-plugins-j4mx4" Apr 17 17:24:01.126730 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:24:01.126372 2570 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/0a83ec74-42da-427f-be72-02e777b9626c-var-lib-openvswitch\") pod \"ovnkube-node-42tv5\" (UID: \"0a83ec74-42da-427f-be72-02e777b9626c\") " pod="openshift-ovn-kubernetes/ovnkube-node-42tv5" Apr 17 17:24:01.126730 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:24:01.126389 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/8947a5af-e839-4e78-8aef-37f0885ea400-host\") pod \"node-ca-ks8m2\" (UID: \"8947a5af-e839-4e78-8aef-37f0885ea400\") " pod="openshift-image-registry/node-ca-ks8m2" Apr 17 17:24:01.126730 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:24:01.126411 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/61a26f4f-99b7-4f63-a4a2-b6fef61dff87-kubelet-dir\") pod \"aws-ebs-csi-driver-node-hjzlb\" (UID: \"61a26f4f-99b7-4f63-a4a2-b6fef61dff87\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-hjzlb" Apr 17 17:24:01.126730 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:24:01.126432 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/61a26f4f-99b7-4f63-a4a2-b6fef61dff87-sys-fs\") pod \"aws-ebs-csi-driver-node-hjzlb\" (UID: \"61a26f4f-99b7-4f63-a4a2-b6fef61dff87\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-hjzlb" Apr 17 17:24:01.126730 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:24:01.126467 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/0a83ec74-42da-427f-be72-02e777b9626c-host-kubelet\") pod \"ovnkube-node-42tv5\" (UID: \"0a83ec74-42da-427f-be72-02e777b9626c\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-42tv5" Apr 17 17:24:01.126730 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:24:01.126494 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/0a83ec74-42da-427f-be72-02e777b9626c-systemd-units\") pod \"ovnkube-node-42tv5\" (UID: \"0a83ec74-42da-427f-be72-02e777b9626c\") " pod="openshift-ovn-kubernetes/ovnkube-node-42tv5" Apr 17 17:24:01.126730 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:24:01.126508 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/0a83ec74-42da-427f-be72-02e777b9626c-run-systemd\") pod \"ovnkube-node-42tv5\" (UID: \"0a83ec74-42da-427f-be72-02e777b9626c\") " pod="openshift-ovn-kubernetes/ovnkube-node-42tv5" Apr 17 17:24:01.126730 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:24:01.126522 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/0a83ec74-42da-427f-be72-02e777b9626c-host-cni-netd\") pod \"ovnkube-node-42tv5\" (UID: \"0a83ec74-42da-427f-be72-02e777b9626c\") " pod="openshift-ovn-kubernetes/ovnkube-node-42tv5" Apr 17 17:24:01.126730 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:24:01.126536 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/eb68dc81-dce9-4f3e-bf24-94eff6bc04bf-agent-certs\") pod \"konnectivity-agent-n6fpq\" (UID: \"eb68dc81-dce9-4f3e-bf24-94eff6bc04bf\") " pod="kube-system/konnectivity-agent-n6fpq" Apr 17 17:24:01.127301 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:24:01.126549 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"device-dir\" (UniqueName: 
\"kubernetes.io/host-path/61a26f4f-99b7-4f63-a4a2-b6fef61dff87-device-dir\") pod \"aws-ebs-csi-driver-node-hjzlb\" (UID: \"61a26f4f-99b7-4f63-a4a2-b6fef61dff87\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-hjzlb" Apr 17 17:24:01.127301 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:24:01.126586 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/9f3a808b-95e3-410b-bcf1-257bf1254f04-host-var-lib-kubelet\") pod \"multus-xtx74\" (UID: \"9f3a808b-95e3-410b-bcf1-257bf1254f04\") " pod="openshift-multus/multus-xtx74" Apr 17 17:24:01.127301 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:24:01.126622 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/9f3a808b-95e3-410b-bcf1-257bf1254f04-host-run-multus-certs\") pod \"multus-xtx74\" (UID: \"9f3a808b-95e3-410b-bcf1-257bf1254f04\") " pod="openshift-multus/multus-xtx74" Apr 17 17:24:01.127301 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:24:01.126650 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/1f813146-daee-4a79-a436-af839213c97f-etc-sysctl-conf\") pod \"tuned-426x5\" (UID: \"1f813146-daee-4a79-a436-af839213c97f\") " pod="openshift-cluster-node-tuning-operator/tuned-426x5" Apr 17 17:24:01.127301 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:24:01.126676 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/1f813146-daee-4a79-a436-af839213c97f-lib-modules\") pod \"tuned-426x5\" (UID: \"1f813146-daee-4a79-a436-af839213c97f\") " pod="openshift-cluster-node-tuning-operator/tuned-426x5" Apr 17 17:24:01.127301 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:24:01.126717 2570 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/0a83ec74-42da-427f-be72-02e777b9626c-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-42tv5\" (UID: \"0a83ec74-42da-427f-be72-02e777b9626c\") " pod="openshift-ovn-kubernetes/ovnkube-node-42tv5" Apr 17 17:24:01.127301 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:24:01.126743 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gh2c2\" (UniqueName: \"kubernetes.io/projected/0a83ec74-42da-427f-be72-02e777b9626c-kube-api-access-gh2c2\") pod \"ovnkube-node-42tv5\" (UID: \"0a83ec74-42da-427f-be72-02e777b9626c\") " pod="openshift-ovn-kubernetes/ovnkube-node-42tv5" Apr 17 17:24:01.127301 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:24:01.126767 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/eb68dc81-dce9-4f3e-bf24-94eff6bc04bf-konnectivity-ca\") pod \"konnectivity-agent-n6fpq\" (UID: \"eb68dc81-dce9-4f3e-bf24-94eff6bc04bf\") " pod="kube-system/konnectivity-agent-n6fpq" Apr 17 17:24:01.127301 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:24:01.126790 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/9f3a808b-95e3-410b-bcf1-257bf1254f04-os-release\") pod \"multus-xtx74\" (UID: \"9f3a808b-95e3-410b-bcf1-257bf1254f04\") " pod="openshift-multus/multus-xtx74" Apr 17 17:24:01.127301 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:24:01.126815 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/0a83ec74-42da-427f-be72-02e777b9626c-host-run-netns\") pod \"ovnkube-node-42tv5\" (UID: 
\"0a83ec74-42da-427f-be72-02e777b9626c\") " pod="openshift-ovn-kubernetes/ovnkube-node-42tv5" Apr 17 17:24:01.127301 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:24:01.126843 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/0a83ec74-42da-427f-be72-02e777b9626c-log-socket\") pod \"ovnkube-node-42tv5\" (UID: \"0a83ec74-42da-427f-be72-02e777b9626c\") " pod="openshift-ovn-kubernetes/ovnkube-node-42tv5" Apr 17 17:24:01.127301 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:24:01.126870 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/0a83ec74-42da-427f-be72-02e777b9626c-ovn-node-metrics-cert\") pod \"ovnkube-node-42tv5\" (UID: \"0a83ec74-42da-427f-be72-02e777b9626c\") " pod="openshift-ovn-kubernetes/ovnkube-node-42tv5" Apr 17 17:24:01.127301 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:24:01.126895 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rgnqk\" (UniqueName: \"kubernetes.io/projected/8947a5af-e839-4e78-8aef-37f0885ea400-kube-api-access-rgnqk\") pod \"node-ca-ks8m2\" (UID: \"8947a5af-e839-4e78-8aef-37f0885ea400\") " pod="openshift-image-registry/node-ca-ks8m2" Apr 17 17:24:01.127301 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:24:01.126909 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/9f3a808b-95e3-410b-bcf1-257bf1254f04-multus-socket-dir-parent\") pod \"multus-xtx74\" (UID: \"9f3a808b-95e3-410b-bcf1-257bf1254f04\") " pod="openshift-multus/multus-xtx74" Apr 17 17:24:01.127301 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:24:01.126940 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: 
\"kubernetes.io/host-path/02eb784a-744c-4eac-bec8-46a3df516ca9-cnibin\") pod \"multus-additional-cni-plugins-j4mx4\" (UID: \"02eb784a-744c-4eac-bec8-46a3df516ca9\") " pod="openshift-multus/multus-additional-cni-plugins-j4mx4" Apr 17 17:24:01.127301 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:24:01.126955 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/0a83ec74-42da-427f-be72-02e777b9626c-run-openvswitch\") pod \"ovnkube-node-42tv5\" (UID: \"0a83ec74-42da-427f-be72-02e777b9626c\") " pod="openshift-ovn-kubernetes/ovnkube-node-42tv5" Apr 17 17:24:01.127811 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:24:01.126969 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/0a83ec74-42da-427f-be72-02e777b9626c-env-overrides\") pod \"ovnkube-node-42tv5\" (UID: \"0a83ec74-42da-427f-be72-02e777b9626c\") " pod="openshift-ovn-kubernetes/ovnkube-node-42tv5" Apr 17 17:24:01.127811 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:24:01.127032 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/7683382f-975a-43b6-9d75-7d14283c2328-iptables-alerter-script\") pod \"iptables-alerter-gbft6\" (UID: \"7683382f-975a-43b6-9d75-7d14283c2328\") " pod="openshift-network-operator/iptables-alerter-gbft6" Apr 17 17:24:01.127811 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:24:01.127065 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/9f3a808b-95e3-410b-bcf1-257bf1254f04-multus-cni-dir\") pod \"multus-xtx74\" (UID: \"9f3a808b-95e3-410b-bcf1-257bf1254f04\") " pod="openshift-multus/multus-xtx74" Apr 17 17:24:01.127811 ip-10-0-136-202 kubenswrapper[2570]: I0417 
17:24:01.127091 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/1f813146-daee-4a79-a436-af839213c97f-run\") pod \"tuned-426x5\" (UID: \"1f813146-daee-4a79-a436-af839213c97f\") " pod="openshift-cluster-node-tuning-operator/tuned-426x5" Apr 17 17:24:01.127811 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:24:01.127127 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/1f813146-daee-4a79-a436-af839213c97f-var-lib-kubelet\") pod \"tuned-426x5\" (UID: \"1f813146-daee-4a79-a436-af839213c97f\") " pod="openshift-cluster-node-tuning-operator/tuned-426x5" Apr 17 17:24:01.127811 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:24:01.127175 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/0a83ec74-42da-427f-be72-02e777b9626c-host-cni-bin\") pod \"ovnkube-node-42tv5\" (UID: \"0a83ec74-42da-427f-be72-02e777b9626c\") " pod="openshift-ovn-kubernetes/ovnkube-node-42tv5" Apr 17 17:24:01.127811 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:24:01.127203 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ch7zp\" (UniqueName: \"kubernetes.io/projected/7683382f-975a-43b6-9d75-7d14283c2328-kube-api-access-ch7zp\") pod \"iptables-alerter-gbft6\" (UID: \"7683382f-975a-43b6-9d75-7d14283c2328\") " pod="openshift-network-operator/iptables-alerter-gbft6" Apr 17 17:24:01.127811 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:24:01.127240 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/61a26f4f-99b7-4f63-a4a2-b6fef61dff87-socket-dir\") pod \"aws-ebs-csi-driver-node-hjzlb\" (UID: \"61a26f4f-99b7-4f63-a4a2-b6fef61dff87\") " 
pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-hjzlb" Apr 17 17:24:01.127811 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:24:01.127264 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/9f3a808b-95e3-410b-bcf1-257bf1254f04-multus-daemon-config\") pod \"multus-xtx74\" (UID: \"9f3a808b-95e3-410b-bcf1-257bf1254f04\") " pod="openshift-multus/multus-xtx74" Apr 17 17:24:01.127811 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:24:01.127287 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/02eb784a-744c-4eac-bec8-46a3df516ca9-cni-binary-copy\") pod \"multus-additional-cni-plugins-j4mx4\" (UID: \"02eb784a-744c-4eac-bec8-46a3df516ca9\") " pod="openshift-multus/multus-additional-cni-plugins-j4mx4" Apr 17 17:24:01.127811 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:24:01.127314 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/02eb784a-744c-4eac-bec8-46a3df516ca9-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-j4mx4\" (UID: \"02eb784a-744c-4eac-bec8-46a3df516ca9\") " pod="openshift-multus/multus-additional-cni-plugins-j4mx4" Apr 17 17:24:01.127811 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:24:01.127336 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/9f3a808b-95e3-410b-bcf1-257bf1254f04-host-var-lib-cni-bin\") pod \"multus-xtx74\" (UID: \"9f3a808b-95e3-410b-bcf1-257bf1254f04\") " pod="openshift-multus/multus-xtx74" Apr 17 17:24:01.127811 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:24:01.127367 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/9f3a808b-95e3-410b-bcf1-257bf1254f04-host-var-lib-cni-multus\") pod \"multus-xtx74\" (UID: \"9f3a808b-95e3-410b-bcf1-257bf1254f04\") " pod="openshift-multus/multus-xtx74" Apr 17 17:24:01.127811 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:24:01.127382 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/e75ff2ec-f311-467f-a5b6-86322293f3ed-tmp-dir\") pod \"node-resolver-b6njn\" (UID: \"e75ff2ec-f311-467f-a5b6-86322293f3ed\") " pod="openshift-dns/node-resolver-b6njn" Apr 17 17:24:01.127811 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:24:01.127403 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/1f813146-daee-4a79-a436-af839213c97f-etc-kubernetes\") pod \"tuned-426x5\" (UID: \"1f813146-daee-4a79-a436-af839213c97f\") " pod="openshift-cluster-node-tuning-operator/tuned-426x5" Apr 17 17:24:01.127811 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:24:01.127424 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/0a83ec74-42da-427f-be72-02e777b9626c-run-ovn\") pod \"ovnkube-node-42tv5\" (UID: \"0a83ec74-42da-427f-be72-02e777b9626c\") " pod="openshift-ovn-kubernetes/ovnkube-node-42tv5" Apr 17 17:24:01.128303 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:24:01.127449 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/61a26f4f-99b7-4f63-a4a2-b6fef61dff87-etc-selinux\") pod \"aws-ebs-csi-driver-node-hjzlb\" (UID: \"61a26f4f-99b7-4f63-a4a2-b6fef61dff87\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-hjzlb" Apr 17 17:24:01.128303 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:24:01.127484 2570 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/82dffe03-2b0c-4ac4-bb02-5a8430704805-metrics-certs\") pod \"network-metrics-daemon-zcn2s\" (UID: \"82dffe03-2b0c-4ac4-bb02-5a8430704805\") " pod="openshift-multus/network-metrics-daemon-zcn2s" Apr 17 17:24:01.134090 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:24:01.134073 2570 reflector.go:430] "Caches populated" logger="kubernetes.io/kubelet-serving" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162" Apr 17 17:24:01.157405 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:24:01.157387 2570 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kubelet-serving" csr="csr-tpjqk" Apr 17 17:24:01.165843 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:24:01.165822 2570 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kubelet-serving" csr="csr-tpjqk" Apr 17 17:24:01.183498 ip-10-0-136-202 kubenswrapper[2570]: W0417 17:24:01.183474 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod697494af72639fc0eb382c71525a5808.slice/crio-f93b50ee91b504583a19843cfe1bd0fa495e7e7bd7b3874daed15abff997d174 WatchSource:0}: Error finding container f93b50ee91b504583a19843cfe1bd0fa495e7e7bd7b3874daed15abff997d174: Status 404 returned error can't find the container with id f93b50ee91b504583a19843cfe1bd0fa495e7e7bd7b3874daed15abff997d174 Apr 17 17:24:01.183986 ip-10-0-136-202 kubenswrapper[2570]: W0417 17:24:01.183960 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5edfcc530d2d3e0bfbb4e13a5b40905e.slice/crio-f7bc47e1af9e8cec699e5c8aedef255fb4276671c757c1d7acf6ea3f69cbe922 WatchSource:0}: Error finding container 
f7bc47e1af9e8cec699e5c8aedef255fb4276671c757c1d7acf6ea3f69cbe922: Status 404 returned error can't find the container with id f7bc47e1af9e8cec699e5c8aedef255fb4276671c757c1d7acf6ea3f69cbe922 Apr 17 17:24:01.189556 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:24:01.189542 2570 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 17 17:24:01.227676 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:24:01.227658 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/9f3a808b-95e3-410b-bcf1-257bf1254f04-os-release\") pod \"multus-xtx74\" (UID: \"9f3a808b-95e3-410b-bcf1-257bf1254f04\") " pod="openshift-multus/multus-xtx74" Apr 17 17:24:01.227758 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:24:01.227681 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/0a83ec74-42da-427f-be72-02e777b9626c-host-run-netns\") pod \"ovnkube-node-42tv5\" (UID: \"0a83ec74-42da-427f-be72-02e777b9626c\") " pod="openshift-ovn-kubernetes/ovnkube-node-42tv5" Apr 17 17:24:01.227758 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:24:01.227696 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/0a83ec74-42da-427f-be72-02e777b9626c-log-socket\") pod \"ovnkube-node-42tv5\" (UID: \"0a83ec74-42da-427f-be72-02e777b9626c\") " pod="openshift-ovn-kubernetes/ovnkube-node-42tv5" Apr 17 17:24:01.227758 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:24:01.227710 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/0a83ec74-42da-427f-be72-02e777b9626c-ovn-node-metrics-cert\") pod \"ovnkube-node-42tv5\" (UID: \"0a83ec74-42da-427f-be72-02e777b9626c\") " pod="openshift-ovn-kubernetes/ovnkube-node-42tv5" Apr 17 17:24:01.227758 ip-10-0-136-202 
kubenswrapper[2570]: I0417 17:24:01.227741 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rgnqk\" (UniqueName: \"kubernetes.io/projected/8947a5af-e839-4e78-8aef-37f0885ea400-kube-api-access-rgnqk\") pod \"node-ca-ks8m2\" (UID: \"8947a5af-e839-4e78-8aef-37f0885ea400\") " pod="openshift-image-registry/node-ca-ks8m2" Apr 17 17:24:01.228034 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:24:01.227779 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/9f3a808b-95e3-410b-bcf1-257bf1254f04-os-release\") pod \"multus-xtx74\" (UID: \"9f3a808b-95e3-410b-bcf1-257bf1254f04\") " pod="openshift-multus/multus-xtx74" Apr 17 17:24:01.228034 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:24:01.227781 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/0a83ec74-42da-427f-be72-02e777b9626c-log-socket\") pod \"ovnkube-node-42tv5\" (UID: \"0a83ec74-42da-427f-be72-02e777b9626c\") " pod="openshift-ovn-kubernetes/ovnkube-node-42tv5" Apr 17 17:24:01.228034 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:24:01.227805 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/0a83ec74-42da-427f-be72-02e777b9626c-host-run-netns\") pod \"ovnkube-node-42tv5\" (UID: \"0a83ec74-42da-427f-be72-02e777b9626c\") " pod="openshift-ovn-kubernetes/ovnkube-node-42tv5" Apr 17 17:24:01.228034 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:24:01.227854 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/9f3a808b-95e3-410b-bcf1-257bf1254f04-multus-socket-dir-parent\") pod \"multus-xtx74\" (UID: \"9f3a808b-95e3-410b-bcf1-257bf1254f04\") " pod="openshift-multus/multus-xtx74" Apr 17 17:24:01.228034 ip-10-0-136-202 kubenswrapper[2570]: I0417 
17:24:01.227884 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/02eb784a-744c-4eac-bec8-46a3df516ca9-cnibin\") pod \"multus-additional-cni-plugins-j4mx4\" (UID: \"02eb784a-744c-4eac-bec8-46a3df516ca9\") " pod="openshift-multus/multus-additional-cni-plugins-j4mx4" Apr 17 17:24:01.228034 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:24:01.227907 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/0a83ec74-42da-427f-be72-02e777b9626c-run-openvswitch\") pod \"ovnkube-node-42tv5\" (UID: \"0a83ec74-42da-427f-be72-02e777b9626c\") " pod="openshift-ovn-kubernetes/ovnkube-node-42tv5" Apr 17 17:24:01.228034 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:24:01.227949 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/9f3a808b-95e3-410b-bcf1-257bf1254f04-multus-socket-dir-parent\") pod \"multus-xtx74\" (UID: \"9f3a808b-95e3-410b-bcf1-257bf1254f04\") " pod="openshift-multus/multus-xtx74" Apr 17 17:24:01.228034 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:24:01.227951 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/02eb784a-744c-4eac-bec8-46a3df516ca9-cnibin\") pod \"multus-additional-cni-plugins-j4mx4\" (UID: \"02eb784a-744c-4eac-bec8-46a3df516ca9\") " pod="openshift-multus/multus-additional-cni-plugins-j4mx4" Apr 17 17:24:01.228034 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:24:01.227934 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/0a83ec74-42da-427f-be72-02e777b9626c-env-overrides\") pod \"ovnkube-node-42tv5\" (UID: \"0a83ec74-42da-427f-be72-02e777b9626c\") " pod="openshift-ovn-kubernetes/ovnkube-node-42tv5" Apr 17 17:24:01.228034 ip-10-0-136-202 
kubenswrapper[2570]: I0417 17:24:01.228000 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/7683382f-975a-43b6-9d75-7d14283c2328-iptables-alerter-script\") pod \"iptables-alerter-gbft6\" (UID: \"7683382f-975a-43b6-9d75-7d14283c2328\") " pod="openshift-network-operator/iptables-alerter-gbft6" Apr 17 17:24:01.228034 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:24:01.228020 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/9f3a808b-95e3-410b-bcf1-257bf1254f04-multus-cni-dir\") pod \"multus-xtx74\" (UID: \"9f3a808b-95e3-410b-bcf1-257bf1254f04\") " pod="openshift-multus/multus-xtx74" Apr 17 17:24:01.228034 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:24:01.227971 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/0a83ec74-42da-427f-be72-02e777b9626c-run-openvswitch\") pod \"ovnkube-node-42tv5\" (UID: \"0a83ec74-42da-427f-be72-02e777b9626c\") " pod="openshift-ovn-kubernetes/ovnkube-node-42tv5" Apr 17 17:24:01.228599 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:24:01.228050 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/1f813146-daee-4a79-a436-af839213c97f-run\") pod \"tuned-426x5\" (UID: \"1f813146-daee-4a79-a436-af839213c97f\") " pod="openshift-cluster-node-tuning-operator/tuned-426x5" Apr 17 17:24:01.228599 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:24:01.228058 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/9f3a808b-95e3-410b-bcf1-257bf1254f04-multus-cni-dir\") pod \"multus-xtx74\" (UID: \"9f3a808b-95e3-410b-bcf1-257bf1254f04\") " pod="openshift-multus/multus-xtx74" Apr 17 17:24:01.228599 ip-10-0-136-202 kubenswrapper[2570]: I0417 
17:24:01.228066 2570 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Apr 17 17:24:01.228599 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:24:01.228092 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/1f813146-daee-4a79-a436-af839213c97f-run\") pod \"tuned-426x5\" (UID: \"1f813146-daee-4a79-a436-af839213c97f\") " pod="openshift-cluster-node-tuning-operator/tuned-426x5" Apr 17 17:24:01.228599 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:24:01.228092 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/1f813146-daee-4a79-a436-af839213c97f-var-lib-kubelet\") pod \"tuned-426x5\" (UID: \"1f813146-daee-4a79-a436-af839213c97f\") " pod="openshift-cluster-node-tuning-operator/tuned-426x5" Apr 17 17:24:01.228599 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:24:01.228131 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/0a83ec74-42da-427f-be72-02e777b9626c-host-cni-bin\") pod \"ovnkube-node-42tv5\" (UID: \"0a83ec74-42da-427f-be72-02e777b9626c\") " pod="openshift-ovn-kubernetes/ovnkube-node-42tv5" Apr 17 17:24:01.228599 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:24:01.228135 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/1f813146-daee-4a79-a436-af839213c97f-var-lib-kubelet\") pod \"tuned-426x5\" (UID: \"1f813146-daee-4a79-a436-af839213c97f\") " pod="openshift-cluster-node-tuning-operator/tuned-426x5" Apr 17 17:24:01.228599 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:24:01.228165 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ch7zp\" (UniqueName: 
\"kubernetes.io/projected/7683382f-975a-43b6-9d75-7d14283c2328-kube-api-access-ch7zp\") pod \"iptables-alerter-gbft6\" (UID: \"7683382f-975a-43b6-9d75-7d14283c2328\") " pod="openshift-network-operator/iptables-alerter-gbft6" Apr 17 17:24:01.228599 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:24:01.228182 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/61a26f4f-99b7-4f63-a4a2-b6fef61dff87-socket-dir\") pod \"aws-ebs-csi-driver-node-hjzlb\" (UID: \"61a26f4f-99b7-4f63-a4a2-b6fef61dff87\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-hjzlb" Apr 17 17:24:01.228599 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:24:01.228191 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/0a83ec74-42da-427f-be72-02e777b9626c-host-cni-bin\") pod \"ovnkube-node-42tv5\" (UID: \"0a83ec74-42da-427f-be72-02e777b9626c\") " pod="openshift-ovn-kubernetes/ovnkube-node-42tv5" Apr 17 17:24:01.228599 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:24:01.228281 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/61a26f4f-99b7-4f63-a4a2-b6fef61dff87-socket-dir\") pod \"aws-ebs-csi-driver-node-hjzlb\" (UID: \"61a26f4f-99b7-4f63-a4a2-b6fef61dff87\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-hjzlb" Apr 17 17:24:01.228599 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:24:01.228315 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/9f3a808b-95e3-410b-bcf1-257bf1254f04-multus-daemon-config\") pod \"multus-xtx74\" (UID: \"9f3a808b-95e3-410b-bcf1-257bf1254f04\") " pod="openshift-multus/multus-xtx74" Apr 17 17:24:01.228599 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:24:01.228345 2570 reconciler_common.go:224] "operationExecutor.MountVolume started 
for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/02eb784a-744c-4eac-bec8-46a3df516ca9-cni-binary-copy\") pod \"multus-additional-cni-plugins-j4mx4\" (UID: \"02eb784a-744c-4eac-bec8-46a3df516ca9\") " pod="openshift-multus/multus-additional-cni-plugins-j4mx4" Apr 17 17:24:01.228599 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:24:01.228371 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/02eb784a-744c-4eac-bec8-46a3df516ca9-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-j4mx4\" (UID: \"02eb784a-744c-4eac-bec8-46a3df516ca9\") " pod="openshift-multus/multus-additional-cni-plugins-j4mx4" Apr 17 17:24:01.228599 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:24:01.228398 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/9f3a808b-95e3-410b-bcf1-257bf1254f04-host-var-lib-cni-bin\") pod \"multus-xtx74\" (UID: \"9f3a808b-95e3-410b-bcf1-257bf1254f04\") " pod="openshift-multus/multus-xtx74" Apr 17 17:24:01.228599 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:24:01.228422 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/0a83ec74-42da-427f-be72-02e777b9626c-env-overrides\") pod \"ovnkube-node-42tv5\" (UID: \"0a83ec74-42da-427f-be72-02e777b9626c\") " pod="openshift-ovn-kubernetes/ovnkube-node-42tv5" Apr 17 17:24:01.228599 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:24:01.228425 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/9f3a808b-95e3-410b-bcf1-257bf1254f04-host-var-lib-cni-multus\") pod \"multus-xtx74\" (UID: \"9f3a808b-95e3-410b-bcf1-257bf1254f04\") " pod="openshift-multus/multus-xtx74" Apr 17 17:24:01.228599 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:24:01.228486 2570 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/e75ff2ec-f311-467f-a5b6-86322293f3ed-tmp-dir\") pod \"node-resolver-b6njn\" (UID: \"e75ff2ec-f311-467f-a5b6-86322293f3ed\") " pod="openshift-dns/node-resolver-b6njn" Apr 17 17:24:01.229397 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:24:01.228511 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/1f813146-daee-4a79-a436-af839213c97f-etc-kubernetes\") pod \"tuned-426x5\" (UID: \"1f813146-daee-4a79-a436-af839213c97f\") " pod="openshift-cluster-node-tuning-operator/tuned-426x5" Apr 17 17:24:01.229397 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:24:01.228535 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/0a83ec74-42da-427f-be72-02e777b9626c-run-ovn\") pod \"ovnkube-node-42tv5\" (UID: \"0a83ec74-42da-427f-be72-02e777b9626c\") " pod="openshift-ovn-kubernetes/ovnkube-node-42tv5" Apr 17 17:24:01.229397 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:24:01.228559 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/61a26f4f-99b7-4f63-a4a2-b6fef61dff87-etc-selinux\") pod \"aws-ebs-csi-driver-node-hjzlb\" (UID: \"61a26f4f-99b7-4f63-a4a2-b6fef61dff87\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-hjzlb" Apr 17 17:24:01.229397 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:24:01.228586 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/82dffe03-2b0c-4ac4-bb02-5a8430704805-metrics-certs\") pod \"network-metrics-daemon-zcn2s\" (UID: \"82dffe03-2b0c-4ac4-bb02-5a8430704805\") " pod="openshift-multus/network-metrics-daemon-zcn2s" Apr 17 17:24:01.229397 ip-10-0-136-202 kubenswrapper[2570]: I0417 
17:24:01.228602 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/1f813146-daee-4a79-a436-af839213c97f-etc-kubernetes\") pod \"tuned-426x5\" (UID: \"1f813146-daee-4a79-a436-af839213c97f\") " pod="openshift-cluster-node-tuning-operator/tuned-426x5" Apr 17 17:24:01.229397 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:24:01.228615 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mdj26\" (UniqueName: \"kubernetes.io/projected/e75ff2ec-f311-467f-a5b6-86322293f3ed-kube-api-access-mdj26\") pod \"node-resolver-b6njn\" (UID: \"e75ff2ec-f311-467f-a5b6-86322293f3ed\") " pod="openshift-dns/node-resolver-b6njn" Apr 17 17:24:01.229397 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:24:01.228643 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/1f813146-daee-4a79-a436-af839213c97f-etc-sysconfig\") pod \"tuned-426x5\" (UID: \"1f813146-daee-4a79-a436-af839213c97f\") " pod="openshift-cluster-node-tuning-operator/tuned-426x5" Apr 17 17:24:01.229397 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:24:01.228666 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/1f813146-daee-4a79-a436-af839213c97f-etc-sysctl-d\") pod \"tuned-426x5\" (UID: \"1f813146-daee-4a79-a436-af839213c97f\") " pod="openshift-cluster-node-tuning-operator/tuned-426x5" Apr 17 17:24:01.229397 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:24:01.228690 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/1f813146-daee-4a79-a436-af839213c97f-etc-systemd\") pod \"tuned-426x5\" (UID: \"1f813146-daee-4a79-a436-af839213c97f\") " pod="openshift-cluster-node-tuning-operator/tuned-426x5" Apr 17 17:24:01.229397 ip-10-0-136-202 kubenswrapper[2570]: I0417 
17:24:01.228715 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/02eb784a-744c-4eac-bec8-46a3df516ca9-tuning-conf-dir\") pod \"multus-additional-cni-plugins-j4mx4\" (UID: \"02eb784a-744c-4eac-bec8-46a3df516ca9\") " pod="openshift-multus/multus-additional-cni-plugins-j4mx4" Apr 17 17:24:01.229397 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:24:01.228738 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/0a83ec74-42da-427f-be72-02e777b9626c-node-log\") pod \"ovnkube-node-42tv5\" (UID: \"0a83ec74-42da-427f-be72-02e777b9626c\") " pod="openshift-ovn-kubernetes/ovnkube-node-42tv5" Apr 17 17:24:01.229397 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:24:01.228761 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/9f3a808b-95e3-410b-bcf1-257bf1254f04-host-run-netns\") pod \"multus-xtx74\" (UID: \"9f3a808b-95e3-410b-bcf1-257bf1254f04\") " pod="openshift-multus/multus-xtx74" Apr 17 17:24:01.229397 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:24:01.228762 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/9f3a808b-95e3-410b-bcf1-257bf1254f04-host-var-lib-cni-bin\") pod \"multus-xtx74\" (UID: \"9f3a808b-95e3-410b-bcf1-257bf1254f04\") " pod="openshift-multus/multus-xtx74" Apr 17 17:24:01.229397 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:24:01.228785 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/1f813146-daee-4a79-a436-af839213c97f-etc-modprobe-d\") pod \"tuned-426x5\" (UID: \"1f813146-daee-4a79-a436-af839213c97f\") " pod="openshift-cluster-node-tuning-operator/tuned-426x5" Apr 17 17:24:01.229397 ip-10-0-136-202 kubenswrapper[2570]: I0417 
17:24:01.228807 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/1f813146-daee-4a79-a436-af839213c97f-host\") pod \"tuned-426x5\" (UID: \"1f813146-daee-4a79-a436-af839213c97f\") " pod="openshift-cluster-node-tuning-operator/tuned-426x5" Apr 17 17:24:01.229397 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:24:01.228833 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/1f813146-daee-4a79-a436-af839213c97f-etc-sysconfig\") pod \"tuned-426x5\" (UID: \"1f813146-daee-4a79-a436-af839213c97f\") " pod="openshift-cluster-node-tuning-operator/tuned-426x5" Apr 17 17:24:01.229397 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:24:01.228828 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5m6t7\" (UniqueName: \"kubernetes.io/projected/61a26f4f-99b7-4f63-a4a2-b6fef61dff87-kube-api-access-5m6t7\") pod \"aws-ebs-csi-driver-node-hjzlb\" (UID: \"61a26f4f-99b7-4f63-a4a2-b6fef61dff87\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-hjzlb" Apr 17 17:24:01.230093 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:24:01.228868 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/e75ff2ec-f311-467f-a5b6-86322293f3ed-tmp-dir\") pod \"node-resolver-b6njn\" (UID: \"e75ff2ec-f311-467f-a5b6-86322293f3ed\") " pod="openshift-dns/node-resolver-b6njn" Apr 17 17:24:01.230093 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:24:01.228872 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/e75ff2ec-f311-467f-a5b6-86322293f3ed-hosts-file\") pod \"node-resolver-b6njn\" (UID: \"e75ff2ec-f311-467f-a5b6-86322293f3ed\") " pod="openshift-dns/node-resolver-b6njn" Apr 17 17:24:01.230093 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:24:01.228556 2570 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/7683382f-975a-43b6-9d75-7d14283c2328-iptables-alerter-script\") pod \"iptables-alerter-gbft6\" (UID: \"7683382f-975a-43b6-9d75-7d14283c2328\") " pod="openshift-network-operator/iptables-alerter-gbft6" Apr 17 17:24:01.230093 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:24:01.228885 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/02eb784a-744c-4eac-bec8-46a3df516ca9-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-j4mx4\" (UID: \"02eb784a-744c-4eac-bec8-46a3df516ca9\") " pod="openshift-multus/multus-additional-cni-plugins-j4mx4" Apr 17 17:24:01.230093 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:24:01.228893 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/0a83ec74-42da-427f-be72-02e777b9626c-run-ovn\") pod \"ovnkube-node-42tv5\" (UID: \"0a83ec74-42da-427f-be72-02e777b9626c\") " pod="openshift-ovn-kubernetes/ovnkube-node-42tv5" Apr 17 17:24:01.230093 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:24:01.228896 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/1f813146-daee-4a79-a436-af839213c97f-sys\") pod \"tuned-426x5\" (UID: \"1f813146-daee-4a79-a436-af839213c97f\") " pod="openshift-cluster-node-tuning-operator/tuned-426x5" Apr 17 17:24:01.230093 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:24:01.228937 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/61a26f4f-99b7-4f63-a4a2-b6fef61dff87-etc-selinux\") pod \"aws-ebs-csi-driver-node-hjzlb\" (UID: \"61a26f4f-99b7-4f63-a4a2-b6fef61dff87\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-hjzlb" Apr 17 17:24:01.230093 ip-10-0-136-202 
kubenswrapper[2570]: I0417 17:24:01.228945 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/0a83ec74-42da-427f-be72-02e777b9626c-ovnkube-config\") pod \"ovnkube-node-42tv5\" (UID: \"0a83ec74-42da-427f-be72-02e777b9626c\") " pod="openshift-ovn-kubernetes/ovnkube-node-42tv5" Apr 17 17:24:01.230093 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:24:01.228937 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/9f3a808b-95e3-410b-bcf1-257bf1254f04-host-run-netns\") pod \"multus-xtx74\" (UID: \"9f3a808b-95e3-410b-bcf1-257bf1254f04\") " pod="openshift-multus/multus-xtx74" Apr 17 17:24:01.230093 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:24:01.228963 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/9f3a808b-95e3-410b-bcf1-257bf1254f04-host-var-lib-cni-multus\") pod \"multus-xtx74\" (UID: \"9f3a808b-95e3-410b-bcf1-257bf1254f04\") " pod="openshift-multus/multus-xtx74" Apr 17 17:24:01.230093 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:24:01.228973 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/9f3a808b-95e3-410b-bcf1-257bf1254f04-cni-binary-copy\") pod \"multus-xtx74\" (UID: \"9f3a808b-95e3-410b-bcf1-257bf1254f04\") " pod="openshift-multus/multus-xtx74" Apr 17 17:24:01.230093 ip-10-0-136-202 kubenswrapper[2570]: E0417 17:24:01.228980 2570 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 17:24:01.230093 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:24:01.228991 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-d\" (UniqueName: 
\"kubernetes.io/host-path/1f813146-daee-4a79-a436-af839213c97f-etc-sysctl-d\") pod \"tuned-426x5\" (UID: \"1f813146-daee-4a79-a436-af839213c97f\") " pod="openshift-cluster-node-tuning-operator/tuned-426x5" Apr 17 17:24:01.230093 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:24:01.228999 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/0a83ec74-42da-427f-be72-02e777b9626c-ovnkube-script-lib\") pod \"ovnkube-node-42tv5\" (UID: \"0a83ec74-42da-427f-be72-02e777b9626c\") " pod="openshift-ovn-kubernetes/ovnkube-node-42tv5" Apr 17 17:24:01.230093 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:24:01.229011 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/e75ff2ec-f311-467f-a5b6-86322293f3ed-hosts-file\") pod \"node-resolver-b6njn\" (UID: \"e75ff2ec-f311-467f-a5b6-86322293f3ed\") " pod="openshift-dns/node-resolver-b6njn" Apr 17 17:24:01.230093 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:24:01.229040 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/1f813146-daee-4a79-a436-af839213c97f-host\") pod \"tuned-426x5\" (UID: \"1f813146-daee-4a79-a436-af839213c97f\") " pod="openshift-cluster-node-tuning-operator/tuned-426x5" Apr 17 17:24:01.230093 ip-10-0-136-202 kubenswrapper[2570]: E0417 17:24:01.229068 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/82dffe03-2b0c-4ac4-bb02-5a8430704805-metrics-certs podName:82dffe03-2b0c-4ac4-bb02-5a8430704805 nodeName:}" failed. No retries permitted until 2026-04-17 17:24:01.729013817 +0000 UTC m=+2.108357611 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/82dffe03-2b0c-4ac4-bb02-5a8430704805-metrics-certs") pod "network-metrics-daemon-zcn2s" (UID: "82dffe03-2b0c-4ac4-bb02-5a8430704805") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 17:24:01.230825 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:24:01.229074 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/02eb784a-744c-4eac-bec8-46a3df516ca9-tuning-conf-dir\") pod \"multus-additional-cni-plugins-j4mx4\" (UID: \"02eb784a-744c-4eac-bec8-46a3df516ca9\") " pod="openshift-multus/multus-additional-cni-plugins-j4mx4" Apr 17 17:24:01.230825 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:24:01.229089 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/9f3a808b-95e3-410b-bcf1-257bf1254f04-hostroot\") pod \"multus-xtx74\" (UID: \"9f3a808b-95e3-410b-bcf1-257bf1254f04\") " pod="openshift-multus/multus-xtx74" Apr 17 17:24:01.230825 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:24:01.228872 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/9f3a808b-95e3-410b-bcf1-257bf1254f04-multus-daemon-config\") pod \"multus-xtx74\" (UID: \"9f3a808b-95e3-410b-bcf1-257bf1254f04\") " pod="openshift-multus/multus-xtx74" Apr 17 17:24:01.230825 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:24:01.228872 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/02eb784a-744c-4eac-bec8-46a3df516ca9-cni-binary-copy\") pod \"multus-additional-cni-plugins-j4mx4\" (UID: \"02eb784a-744c-4eac-bec8-46a3df516ca9\") " pod="openshift-multus/multus-additional-cni-plugins-j4mx4" Apr 17 17:24:01.230825 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:24:01.229432 2570 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/0a83ec74-42da-427f-be72-02e777b9626c-node-log\") pod \"ovnkube-node-42tv5\" (UID: \"0a83ec74-42da-427f-be72-02e777b9626c\") " pod="openshift-ovn-kubernetes/ovnkube-node-42tv5" Apr 17 17:24:01.230825 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:24:01.229699 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/0a83ec74-42da-427f-be72-02e777b9626c-ovnkube-config\") pod \"ovnkube-node-42tv5\" (UID: \"0a83ec74-42da-427f-be72-02e777b9626c\") " pod="openshift-ovn-kubernetes/ovnkube-node-42tv5" Apr 17 17:24:01.230825 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:24:01.230106 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/9f3a808b-95e3-410b-bcf1-257bf1254f04-cni-binary-copy\") pod \"multus-xtx74\" (UID: \"9f3a808b-95e3-410b-bcf1-257bf1254f04\") " pod="openshift-multus/multus-xtx74" Apr 17 17:24:01.231844 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:24:01.231817 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6d8k9\" (UniqueName: \"kubernetes.io/projected/9f3a808b-95e3-410b-bcf1-257bf1254f04-kube-api-access-6d8k9\") pod \"multus-xtx74\" (UID: \"9f3a808b-95e3-410b-bcf1-257bf1254f04\") " pod="openshift-multus/multus-xtx74" Apr 17 17:24:01.231934 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:24:01.231886 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/1f813146-daee-4a79-a436-af839213c97f-sys\") pod \"tuned-426x5\" (UID: \"1f813146-daee-4a79-a436-af839213c97f\") " pod="openshift-cluster-node-tuning-operator/tuned-426x5" Apr 17 17:24:01.231984 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:24:01.231934 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for 
volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/1f813146-daee-4a79-a436-af839213c97f-etc-tuned\") pod \"tuned-426x5\" (UID: \"1f813146-daee-4a79-a436-af839213c97f\") " pod="openshift-cluster-node-tuning-operator/tuned-426x5" Apr 17 17:24:01.232304 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:24:01.232285 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/1f813146-daee-4a79-a436-af839213c97f-etc-systemd\") pod \"tuned-426x5\" (UID: \"1f813146-daee-4a79-a436-af839213c97f\") " pod="openshift-cluster-node-tuning-operator/tuned-426x5" Apr 17 17:24:01.232304 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:24:01.232293 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/9f3a808b-95e3-410b-bcf1-257bf1254f04-hostroot\") pod \"multus-xtx74\" (UID: \"9f3a808b-95e3-410b-bcf1-257bf1254f04\") " pod="openshift-multus/multus-xtx74" Apr 17 17:24:01.232426 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:24:01.232342 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/1f813146-daee-4a79-a436-af839213c97f-tmp\") pod \"tuned-426x5\" (UID: \"1f813146-daee-4a79-a436-af839213c97f\") " pod="openshift-cluster-node-tuning-operator/tuned-426x5" Apr 17 17:24:01.232426 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:24:01.232343 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/1f813146-daee-4a79-a436-af839213c97f-etc-modprobe-d\") pod \"tuned-426x5\" (UID: \"1f813146-daee-4a79-a436-af839213c97f\") " pod="openshift-cluster-node-tuning-operator/tuned-426x5" Apr 17 17:24:01.232426 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:24:01.232348 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: 
\"kubernetes.io/secret/0a83ec74-42da-427f-be72-02e777b9626c-ovn-node-metrics-cert\") pod \"ovnkube-node-42tv5\" (UID: \"0a83ec74-42da-427f-be72-02e777b9626c\") " pod="openshift-ovn-kubernetes/ovnkube-node-42tv5" Apr 17 17:24:01.232426 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:24:01.232375 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/02eb784a-744c-4eac-bec8-46a3df516ca9-system-cni-dir\") pod \"multus-additional-cni-plugins-j4mx4\" (UID: \"02eb784a-744c-4eac-bec8-46a3df516ca9\") " pod="openshift-multus/multus-additional-cni-plugins-j4mx4" Apr 17 17:24:01.232426 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:24:01.232408 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/0a83ec74-42da-427f-be72-02e777b9626c-host-run-ovn-kubernetes\") pod \"ovnkube-node-42tv5\" (UID: \"0a83ec74-42da-427f-be72-02e777b9626c\") " pod="openshift-ovn-kubernetes/ovnkube-node-42tv5" Apr 17 17:24:01.232633 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:24:01.232428 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/02eb784a-744c-4eac-bec8-46a3df516ca9-system-cni-dir\") pod \"multus-additional-cni-plugins-j4mx4\" (UID: \"02eb784a-744c-4eac-bec8-46a3df516ca9\") " pod="openshift-multus/multus-additional-cni-plugins-j4mx4" Apr 17 17:24:01.232633 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:24:01.232471 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/8947a5af-e839-4e78-8aef-37f0885ea400-serviceca\") pod \"node-ca-ks8m2\" (UID: \"8947a5af-e839-4e78-8aef-37f0885ea400\") " pod="openshift-image-registry/node-ca-ks8m2" Apr 17 17:24:01.232633 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:24:01.232499 2570 operation_generator.go:615] "MountVolume.SetUp 
succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/0a83ec74-42da-427f-be72-02e777b9626c-host-run-ovn-kubernetes\") pod \"ovnkube-node-42tv5\" (UID: \"0a83ec74-42da-427f-be72-02e777b9626c\") " pod="openshift-ovn-kubernetes/ovnkube-node-42tv5" Apr 17 17:24:01.232633 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:24:01.232500 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-87n25\" (UniqueName: \"kubernetes.io/projected/82dffe03-2b0c-4ac4-bb02-5a8430704805-kube-api-access-87n25\") pod \"network-metrics-daemon-zcn2s\" (UID: \"82dffe03-2b0c-4ac4-bb02-5a8430704805\") " pod="openshift-multus/network-metrics-daemon-zcn2s" Apr 17 17:24:01.232633 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:24:01.232548 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/9f3a808b-95e3-410b-bcf1-257bf1254f04-system-cni-dir\") pod \"multus-xtx74\" (UID: \"9f3a808b-95e3-410b-bcf1-257bf1254f04\") " pod="openshift-multus/multus-xtx74" Apr 17 17:24:01.232633 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:24:01.232570 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/9f3a808b-95e3-410b-bcf1-257bf1254f04-host-run-k8s-cni-cncf-io\") pod \"multus-xtx74\" (UID: \"9f3a808b-95e3-410b-bcf1-257bf1254f04\") " pod="openshift-multus/multus-xtx74" Apr 17 17:24:01.232633 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:24:01.232594 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mgfs9\" (UniqueName: \"kubernetes.io/projected/1f813146-daee-4a79-a436-af839213c97f-kube-api-access-mgfs9\") pod \"tuned-426x5\" (UID: \"1f813146-daee-4a79-a436-af839213c97f\") " pod="openshift-cluster-node-tuning-operator/tuned-426x5" Apr 17 17:24:01.232633 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:24:01.232618 2570 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/0a83ec74-42da-427f-be72-02e777b9626c-host-slash\") pod \"ovnkube-node-42tv5\" (UID: \"0a83ec74-42da-427f-be72-02e777b9626c\") " pod="openshift-ovn-kubernetes/ovnkube-node-42tv5" Apr 17 17:24:01.232967 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:24:01.232642 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/7683382f-975a-43b6-9d75-7d14283c2328-host-slash\") pod \"iptables-alerter-gbft6\" (UID: \"7683382f-975a-43b6-9d75-7d14283c2328\") " pod="openshift-network-operator/iptables-alerter-gbft6" Apr 17 17:24:01.232967 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:24:01.232650 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/9f3a808b-95e3-410b-bcf1-257bf1254f04-system-cni-dir\") pod \"multus-xtx74\" (UID: \"9f3a808b-95e3-410b-bcf1-257bf1254f04\") " pod="openshift-multus/multus-xtx74" Apr 17 17:24:01.232967 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:24:01.232666 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/61a26f4f-99b7-4f63-a4a2-b6fef61dff87-registration-dir\") pod \"aws-ebs-csi-driver-node-hjzlb\" (UID: \"61a26f4f-99b7-4f63-a4a2-b6fef61dff87\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-hjzlb" Apr 17 17:24:01.232967 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:24:01.232691 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zjf97\" (UniqueName: \"kubernetes.io/projected/3b7788b5-7a25-42a0-8536-8a543283bb1e-kube-api-access-zjf97\") pod \"network-check-target-xbzwj\" (UID: \"3b7788b5-7a25-42a0-8536-8a543283bb1e\") " pod="openshift-network-diagnostics/network-check-target-xbzwj" Apr 17 17:24:01.232967 ip-10-0-136-202 
kubenswrapper[2570]: I0417 17:24:01.232708 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/9f3a808b-95e3-410b-bcf1-257bf1254f04-host-run-k8s-cni-cncf-io\") pod \"multus-xtx74\" (UID: \"9f3a808b-95e3-410b-bcf1-257bf1254f04\") " pod="openshift-multus/multus-xtx74" Apr 17 17:24:01.232967 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:24:01.232690 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/0a83ec74-42da-427f-be72-02e777b9626c-ovnkube-script-lib\") pod \"ovnkube-node-42tv5\" (UID: \"0a83ec74-42da-427f-be72-02e777b9626c\") " pod="openshift-ovn-kubernetes/ovnkube-node-42tv5" Apr 17 17:24:01.232967 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:24:01.232718 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/9f3a808b-95e3-410b-bcf1-257bf1254f04-etc-kubernetes\") pod \"multus-xtx74\" (UID: \"9f3a808b-95e3-410b-bcf1-257bf1254f04\") " pod="openshift-multus/multus-xtx74" Apr 17 17:24:01.232967 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:24:01.232745 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/02eb784a-744c-4eac-bec8-46a3df516ca9-os-release\") pod \"multus-additional-cni-plugins-j4mx4\" (UID: \"02eb784a-744c-4eac-bec8-46a3df516ca9\") " pod="openshift-multus/multus-additional-cni-plugins-j4mx4" Apr 17 17:24:01.232967 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:24:01.232760 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/61a26f4f-99b7-4f63-a4a2-b6fef61dff87-registration-dir\") pod \"aws-ebs-csi-driver-node-hjzlb\" (UID: \"61a26f4f-99b7-4f63-a4a2-b6fef61dff87\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-hjzlb" Apr 17 
17:24:01.232967 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:24:01.232771 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-w69t5\" (UniqueName: \"kubernetes.io/projected/02eb784a-744c-4eac-bec8-46a3df516ca9-kube-api-access-w69t5\") pod \"multus-additional-cni-plugins-j4mx4\" (UID: \"02eb784a-744c-4eac-bec8-46a3df516ca9\") " pod="openshift-multus/multus-additional-cni-plugins-j4mx4" Apr 17 17:24:01.232967 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:24:01.232797 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/0a83ec74-42da-427f-be72-02e777b9626c-host-slash\") pod \"ovnkube-node-42tv5\" (UID: \"0a83ec74-42da-427f-be72-02e777b9626c\") " pod="openshift-ovn-kubernetes/ovnkube-node-42tv5" Apr 17 17:24:01.232967 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:24:01.232799 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/9f3a808b-95e3-410b-bcf1-257bf1254f04-etc-kubernetes\") pod \"multus-xtx74\" (UID: \"9f3a808b-95e3-410b-bcf1-257bf1254f04\") " pod="openshift-multus/multus-xtx74" Apr 17 17:24:01.232967 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:24:01.232797 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/0a83ec74-42da-427f-be72-02e777b9626c-etc-openvswitch\") pod \"ovnkube-node-42tv5\" (UID: \"0a83ec74-42da-427f-be72-02e777b9626c\") " pod="openshift-ovn-kubernetes/ovnkube-node-42tv5" Apr 17 17:24:01.232967 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:24:01.232831 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/0a83ec74-42da-427f-be72-02e777b9626c-etc-openvswitch\") pod \"ovnkube-node-42tv5\" (UID: \"0a83ec74-42da-427f-be72-02e777b9626c\") " pod="openshift-ovn-kubernetes/ovnkube-node-42tv5" 
Apr 17 17:24:01.232967 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:24:01.232833 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/9f3a808b-95e3-410b-bcf1-257bf1254f04-cnibin\") pod \"multus-xtx74\" (UID: \"9f3a808b-95e3-410b-bcf1-257bf1254f04\") " pod="openshift-multus/multus-xtx74" Apr 17 17:24:01.232967 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:24:01.232852 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/7683382f-975a-43b6-9d75-7d14283c2328-host-slash\") pod \"iptables-alerter-gbft6\" (UID: \"7683382f-975a-43b6-9d75-7d14283c2328\") " pod="openshift-network-operator/iptables-alerter-gbft6" Apr 17 17:24:01.232967 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:24:01.232867 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/9f3a808b-95e3-410b-bcf1-257bf1254f04-multus-conf-dir\") pod \"multus-xtx74\" (UID: \"9f3a808b-95e3-410b-bcf1-257bf1254f04\") " pod="openshift-multus/multus-xtx74" Apr 17 17:24:01.233753 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:24:01.232868 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/9f3a808b-95e3-410b-bcf1-257bf1254f04-cnibin\") pod \"multus-xtx74\" (UID: \"9f3a808b-95e3-410b-bcf1-257bf1254f04\") " pod="openshift-multus/multus-xtx74" Apr 17 17:24:01.233753 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:24:01.232895 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/02eb784a-744c-4eac-bec8-46a3df516ca9-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-j4mx4\" (UID: \"02eb784a-744c-4eac-bec8-46a3df516ca9\") " pod="openshift-multus/multus-additional-cni-plugins-j4mx4" Apr 17 17:24:01.233753 
ip-10-0-136-202 kubenswrapper[2570]: I0417 17:24:01.232915 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/02eb784a-744c-4eac-bec8-46a3df516ca9-os-release\") pod \"multus-additional-cni-plugins-j4mx4\" (UID: \"02eb784a-744c-4eac-bec8-46a3df516ca9\") " pod="openshift-multus/multus-additional-cni-plugins-j4mx4" Apr 17 17:24:01.233753 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:24:01.232927 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/9f3a808b-95e3-410b-bcf1-257bf1254f04-multus-conf-dir\") pod \"multus-xtx74\" (UID: \"9f3a808b-95e3-410b-bcf1-257bf1254f04\") " pod="openshift-multus/multus-xtx74" Apr 17 17:24:01.233753 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:24:01.232976 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/0a83ec74-42da-427f-be72-02e777b9626c-var-lib-openvswitch\") pod \"ovnkube-node-42tv5\" (UID: \"0a83ec74-42da-427f-be72-02e777b9626c\") " pod="openshift-ovn-kubernetes/ovnkube-node-42tv5" Apr 17 17:24:01.233753 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:24:01.233004 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/8947a5af-e839-4e78-8aef-37f0885ea400-host\") pod \"node-ca-ks8m2\" (UID: \"8947a5af-e839-4e78-8aef-37f0885ea400\") " pod="openshift-image-registry/node-ca-ks8m2" Apr 17 17:24:01.233753 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:24:01.233012 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/0a83ec74-42da-427f-be72-02e777b9626c-var-lib-openvswitch\") pod \"ovnkube-node-42tv5\" (UID: \"0a83ec74-42da-427f-be72-02e777b9626c\") " pod="openshift-ovn-kubernetes/ovnkube-node-42tv5" Apr 17 17:24:01.233753 ip-10-0-136-202 
kubenswrapper[2570]: I0417 17:24:01.233029 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/61a26f4f-99b7-4f63-a4a2-b6fef61dff87-kubelet-dir\") pod \"aws-ebs-csi-driver-node-hjzlb\" (UID: \"61a26f4f-99b7-4f63-a4a2-b6fef61dff87\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-hjzlb" Apr 17 17:24:01.233753 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:24:01.233053 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/61a26f4f-99b7-4f63-a4a2-b6fef61dff87-sys-fs\") pod \"aws-ebs-csi-driver-node-hjzlb\" (UID: \"61a26f4f-99b7-4f63-a4a2-b6fef61dff87\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-hjzlb" Apr 17 17:24:01.233753 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:24:01.233053 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/8947a5af-e839-4e78-8aef-37f0885ea400-host\") pod \"node-ca-ks8m2\" (UID: \"8947a5af-e839-4e78-8aef-37f0885ea400\") " pod="openshift-image-registry/node-ca-ks8m2" Apr 17 17:24:01.233753 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:24:01.233077 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/0a83ec74-42da-427f-be72-02e777b9626c-host-kubelet\") pod \"ovnkube-node-42tv5\" (UID: \"0a83ec74-42da-427f-be72-02e777b9626c\") " pod="openshift-ovn-kubernetes/ovnkube-node-42tv5" Apr 17 17:24:01.233753 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:24:01.233081 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/61a26f4f-99b7-4f63-a4a2-b6fef61dff87-kubelet-dir\") pod \"aws-ebs-csi-driver-node-hjzlb\" (UID: \"61a26f4f-99b7-4f63-a4a2-b6fef61dff87\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-hjzlb" Apr 17 
17:24:01.233753 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:24:01.233092 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/61a26f4f-99b7-4f63-a4a2-b6fef61dff87-sys-fs\") pod \"aws-ebs-csi-driver-node-hjzlb\" (UID: \"61a26f4f-99b7-4f63-a4a2-b6fef61dff87\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-hjzlb"
Apr 17 17:24:01.233753 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:24:01.233102 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/0a83ec74-42da-427f-be72-02e777b9626c-systemd-units\") pod \"ovnkube-node-42tv5\" (UID: \"0a83ec74-42da-427f-be72-02e777b9626c\") " pod="openshift-ovn-kubernetes/ovnkube-node-42tv5"
Apr 17 17:24:01.233753 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:24:01.233113 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/0a83ec74-42da-427f-be72-02e777b9626c-host-kubelet\") pod \"ovnkube-node-42tv5\" (UID: \"0a83ec74-42da-427f-be72-02e777b9626c\") " pod="openshift-ovn-kubernetes/ovnkube-node-42tv5"
Apr 17 17:24:01.233753 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:24:01.233122 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/0a83ec74-42da-427f-be72-02e777b9626c-systemd-units\") pod \"ovnkube-node-42tv5\" (UID: \"0a83ec74-42da-427f-be72-02e777b9626c\") " pod="openshift-ovn-kubernetes/ovnkube-node-42tv5"
Apr 17 17:24:01.233753 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:24:01.233126 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/0a83ec74-42da-427f-be72-02e777b9626c-run-systemd\") pod \"ovnkube-node-42tv5\" (UID: \"0a83ec74-42da-427f-be72-02e777b9626c\") " pod="openshift-ovn-kubernetes/ovnkube-node-42tv5"
Apr 17 17:24:01.234563 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:24:01.233172 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/0a83ec74-42da-427f-be72-02e777b9626c-run-systemd\") pod \"ovnkube-node-42tv5\" (UID: \"0a83ec74-42da-427f-be72-02e777b9626c\") " pod="openshift-ovn-kubernetes/ovnkube-node-42tv5"
Apr 17 17:24:01.234563 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:24:01.233177 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/0a83ec74-42da-427f-be72-02e777b9626c-host-cni-netd\") pod \"ovnkube-node-42tv5\" (UID: \"0a83ec74-42da-427f-be72-02e777b9626c\") " pod="openshift-ovn-kubernetes/ovnkube-node-42tv5"
Apr 17 17:24:01.234563 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:24:01.233211 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/eb68dc81-dce9-4f3e-bf24-94eff6bc04bf-agent-certs\") pod \"konnectivity-agent-n6fpq\" (UID: \"eb68dc81-dce9-4f3e-bf24-94eff6bc04bf\") " pod="kube-system/konnectivity-agent-n6fpq"
Apr 17 17:24:01.234563 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:24:01.233238 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/61a26f4f-99b7-4f63-a4a2-b6fef61dff87-device-dir\") pod \"aws-ebs-csi-driver-node-hjzlb\" (UID: \"61a26f4f-99b7-4f63-a4a2-b6fef61dff87\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-hjzlb"
Apr 17 17:24:01.234563 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:24:01.233214 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/0a83ec74-42da-427f-be72-02e777b9626c-host-cni-netd\") pod \"ovnkube-node-42tv5\" (UID: \"0a83ec74-42da-427f-be72-02e777b9626c\") " pod="openshift-ovn-kubernetes/ovnkube-node-42tv5"
Apr 17 17:24:01.234563 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:24:01.233263 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/9f3a808b-95e3-410b-bcf1-257bf1254f04-host-var-lib-kubelet\") pod \"multus-xtx74\" (UID: \"9f3a808b-95e3-410b-bcf1-257bf1254f04\") " pod="openshift-multus/multus-xtx74"
Apr 17 17:24:01.234563 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:24:01.233288 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/9f3a808b-95e3-410b-bcf1-257bf1254f04-host-run-multus-certs\") pod \"multus-xtx74\" (UID: \"9f3a808b-95e3-410b-bcf1-257bf1254f04\") " pod="openshift-multus/multus-xtx74"
Apr 17 17:24:01.234563 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:24:01.233312 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/1f813146-daee-4a79-a436-af839213c97f-etc-sysctl-conf\") pod \"tuned-426x5\" (UID: \"1f813146-daee-4a79-a436-af839213c97f\") " pod="openshift-cluster-node-tuning-operator/tuned-426x5"
Apr 17 17:24:01.234563 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:24:01.233329 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/9f3a808b-95e3-410b-bcf1-257bf1254f04-host-var-lib-kubelet\") pod \"multus-xtx74\" (UID: \"9f3a808b-95e3-410b-bcf1-257bf1254f04\") " pod="openshift-multus/multus-xtx74"
Apr 17 17:24:01.234563 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:24:01.233336 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/1f813146-daee-4a79-a436-af839213c97f-lib-modules\") pod \"tuned-426x5\" (UID: \"1f813146-daee-4a79-a436-af839213c97f\") " pod="openshift-cluster-node-tuning-operator/tuned-426x5"
Apr 17 17:24:01.234563 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:24:01.233373 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/9f3a808b-95e3-410b-bcf1-257bf1254f04-host-run-multus-certs\") pod \"multus-xtx74\" (UID: \"9f3a808b-95e3-410b-bcf1-257bf1254f04\") " pod="openshift-multus/multus-xtx74"
Apr 17 17:24:01.234563 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:24:01.233371 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/0a83ec74-42da-427f-be72-02e777b9626c-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-42tv5\" (UID: \"0a83ec74-42da-427f-be72-02e777b9626c\") " pod="openshift-ovn-kubernetes/ovnkube-node-42tv5"
Apr 17 17:24:01.234563 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:24:01.233405 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/0a83ec74-42da-427f-be72-02e777b9626c-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-42tv5\" (UID: \"0a83ec74-42da-427f-be72-02e777b9626c\") " pod="openshift-ovn-kubernetes/ovnkube-node-42tv5"
Apr 17 17:24:01.234563 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:24:01.233413 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gh2c2\" (UniqueName: \"kubernetes.io/projected/0a83ec74-42da-427f-be72-02e777b9626c-kube-api-access-gh2c2\") pod \"ovnkube-node-42tv5\" (UID: \"0a83ec74-42da-427f-be72-02e777b9626c\") " pod="openshift-ovn-kubernetes/ovnkube-node-42tv5"
Apr 17 17:24:01.234563 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:24:01.233438 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/eb68dc81-dce9-4f3e-bf24-94eff6bc04bf-konnectivity-ca\") pod \"konnectivity-agent-n6fpq\" (UID: \"eb68dc81-dce9-4f3e-bf24-94eff6bc04bf\") " pod="kube-system/konnectivity-agent-n6fpq"
Apr 17 17:24:01.234563 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:24:01.233433 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/02eb784a-744c-4eac-bec8-46a3df516ca9-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-j4mx4\" (UID: \"02eb784a-744c-4eac-bec8-46a3df516ca9\") " pod="openshift-multus/multus-additional-cni-plugins-j4mx4"
Apr 17 17:24:01.234563 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:24:01.233491 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/1f813146-daee-4a79-a436-af839213c97f-etc-sysctl-conf\") pod \"tuned-426x5\" (UID: \"1f813146-daee-4a79-a436-af839213c97f\") " pod="openshift-cluster-node-tuning-operator/tuned-426x5"
Apr 17 17:24:01.235325 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:24:01.233518 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/8947a5af-e839-4e78-8aef-37f0885ea400-serviceca\") pod \"node-ca-ks8m2\" (UID: \"8947a5af-e839-4e78-8aef-37f0885ea400\") " pod="openshift-image-registry/node-ca-ks8m2"
Apr 17 17:24:01.235325 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:24:01.233559 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/1f813146-daee-4a79-a436-af839213c97f-lib-modules\") pod \"tuned-426x5\" (UID: \"1f813146-daee-4a79-a436-af839213c97f\") " pod="openshift-cluster-node-tuning-operator/tuned-426x5"
Apr 17 17:24:01.235325 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:24:01.233561 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/61a26f4f-99b7-4f63-a4a2-b6fef61dff87-device-dir\") pod \"aws-ebs-csi-driver-node-hjzlb\" (UID: \"61a26f4f-99b7-4f63-a4a2-b6fef61dff87\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-hjzlb"
Apr 17 17:24:01.235325 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:24:01.233805 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/eb68dc81-dce9-4f3e-bf24-94eff6bc04bf-konnectivity-ca\") pod \"konnectivity-agent-n6fpq\" (UID: \"eb68dc81-dce9-4f3e-bf24-94eff6bc04bf\") " pod="kube-system/konnectivity-agent-n6fpq"
Apr 17 17:24:01.235325 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:24:01.234357 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/1f813146-daee-4a79-a436-af839213c97f-etc-tuned\") pod \"tuned-426x5\" (UID: \"1f813146-daee-4a79-a436-af839213c97f\") " pod="openshift-cluster-node-tuning-operator/tuned-426x5"
Apr 17 17:24:01.235325 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:24:01.234486 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/1f813146-daee-4a79-a436-af839213c97f-tmp\") pod \"tuned-426x5\" (UID: \"1f813146-daee-4a79-a436-af839213c97f\") " pod="openshift-cluster-node-tuning-operator/tuned-426x5"
Apr 17 17:24:01.235536 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:24:01.235520 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/eb68dc81-dce9-4f3e-bf24-94eff6bc04bf-agent-certs\") pod \"konnectivity-agent-n6fpq\" (UID: \"eb68dc81-dce9-4f3e-bf24-94eff6bc04bf\") " pod="kube-system/konnectivity-agent-n6fpq"
Apr 17 17:24:01.238477 ip-10-0-136-202 kubenswrapper[2570]: E0417 17:24:01.238456 2570 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 17 17:24:01.238560 ip-10-0-136-202 kubenswrapper[2570]: E0417 17:24:01.238485 2570 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 17 17:24:01.238560 ip-10-0-136-202 kubenswrapper[2570]: E0417 17:24:01.238498 2570 projected.go:194] Error preparing data for projected volume kube-api-access-zjf97 for pod openshift-network-diagnostics/network-check-target-xbzwj: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 17 17:24:01.238676 ip-10-0-136-202 kubenswrapper[2570]: E0417 17:24:01.238569 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b7788b5-7a25-42a0-8536-8a543283bb1e-kube-api-access-zjf97 podName:3b7788b5-7a25-42a0-8536-8a543283bb1e nodeName:}" failed. No retries permitted until 2026-04-17 17:24:01.738552415 +0000 UTC m=+2.117896227 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-zjf97" (UniqueName: "kubernetes.io/projected/3b7788b5-7a25-42a0-8536-8a543283bb1e-kube-api-access-zjf97") pod "network-check-target-xbzwj" (UID: "3b7788b5-7a25-42a0-8536-8a543283bb1e") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 17 17:24:01.239663 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:24:01.239639 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-ch7zp\" (UniqueName: \"kubernetes.io/projected/7683382f-975a-43b6-9d75-7d14283c2328-kube-api-access-ch7zp\") pod \"iptables-alerter-gbft6\" (UID: \"7683382f-975a-43b6-9d75-7d14283c2328\") " pod="openshift-network-operator/iptables-alerter-gbft6"
Apr 17 17:24:01.239820 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:24:01.239791 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-rgnqk\" (UniqueName: \"kubernetes.io/projected/8947a5af-e839-4e78-8aef-37f0885ea400-kube-api-access-rgnqk\") pod \"node-ca-ks8m2\" (UID: \"8947a5af-e839-4e78-8aef-37f0885ea400\") " pod="openshift-image-registry/node-ca-ks8m2"
Apr 17 17:24:01.239890 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:24:01.239796 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-5m6t7\" (UniqueName: \"kubernetes.io/projected/61a26f4f-99b7-4f63-a4a2-b6fef61dff87-kube-api-access-5m6t7\") pod \"aws-ebs-csi-driver-node-hjzlb\" (UID: \"61a26f4f-99b7-4f63-a4a2-b6fef61dff87\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-hjzlb"
Apr 17 17:24:01.240168 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:24:01.240132 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-mdj26\" (UniqueName: \"kubernetes.io/projected/e75ff2ec-f311-467f-a5b6-86322293f3ed-kube-api-access-mdj26\") pod \"node-resolver-b6njn\" (UID: \"e75ff2ec-f311-467f-a5b6-86322293f3ed\") " pod="openshift-dns/node-resolver-b6njn"
Apr 17 17:24:01.240946 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:24:01.240925 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-87n25\" (UniqueName: \"kubernetes.io/projected/82dffe03-2b0c-4ac4-bb02-5a8430704805-kube-api-access-87n25\") pod \"network-metrics-daemon-zcn2s\" (UID: \"82dffe03-2b0c-4ac4-bb02-5a8430704805\") " pod="openshift-multus/network-metrics-daemon-zcn2s"
Apr 17 17:24:01.241023 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:24:01.240981 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-mgfs9\" (UniqueName: \"kubernetes.io/projected/1f813146-daee-4a79-a436-af839213c97f-kube-api-access-mgfs9\") pod \"tuned-426x5\" (UID: \"1f813146-daee-4a79-a436-af839213c97f\") " pod="openshift-cluster-node-tuning-operator/tuned-426x5"
Apr 17 17:24:01.241385 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:24:01.241371 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6d8k9\" (UniqueName: \"kubernetes.io/projected/9f3a808b-95e3-410b-bcf1-257bf1254f04-kube-api-access-6d8k9\") pod \"multus-xtx74\" (UID: \"9f3a808b-95e3-410b-bcf1-257bf1254f04\") " pod="openshift-multus/multus-xtx74"
Apr 17 17:24:01.241741 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:24:01.241719 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-gh2c2\" (UniqueName: \"kubernetes.io/projected/0a83ec74-42da-427f-be72-02e777b9626c-kube-api-access-gh2c2\") pod \"ovnkube-node-42tv5\" (UID: \"0a83ec74-42da-427f-be72-02e777b9626c\") " pod="openshift-ovn-kubernetes/ovnkube-node-42tv5"
Apr 17 17:24:01.241807 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:24:01.241723 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-w69t5\" (UniqueName: \"kubernetes.io/projected/02eb784a-744c-4eac-bec8-46a3df516ca9-kube-api-access-w69t5\") pod \"multus-additional-cni-plugins-j4mx4\" (UID: \"02eb784a-744c-4eac-bec8-46a3df516ca9\") " pod="openshift-multus/multus-additional-cni-plugins-j4mx4"
Apr 17 17:24:01.251386 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:24:01.251355 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-136-202.ec2.internal" event={"ID":"697494af72639fc0eb382c71525a5808","Type":"ContainerStarted","Data":"f93b50ee91b504583a19843cfe1bd0fa495e7e7bd7b3874daed15abff997d174"}
Apr 17 17:24:01.252212 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:24:01.252191 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-136-202.ec2.internal" event={"ID":"5edfcc530d2d3e0bfbb4e13a5b40905e","Type":"ContainerStarted","Data":"f7bc47e1af9e8cec699e5c8aedef255fb4276671c757c1d7acf6ea3f69cbe922"}
Apr 17 17:24:01.443327 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:24:01.443265 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-j4mx4"
Apr 17 17:24:01.449952 ip-10-0-136-202 kubenswrapper[2570]: W0417 17:24:01.449931 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod02eb784a_744c_4eac_bec8_46a3df516ca9.slice/crio-516bb35b1f12701aae12093525753e648d453ed63c87a2232bb6d272b0a43b72 WatchSource:0}: Error finding container 516bb35b1f12701aae12093525753e648d453ed63c87a2232bb6d272b0a43b72: Status 404 returned error can't find the container with id 516bb35b1f12701aae12093525753e648d453ed63c87a2232bb6d272b0a43b72
Apr 17 17:24:01.463251 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:24:01.463234 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-xtx74"
Apr 17 17:24:01.469465 ip-10-0-136-202 kubenswrapper[2570]: W0417 17:24:01.469445 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9f3a808b_95e3_410b_bcf1_257bf1254f04.slice/crio-5229e619f8389d23be57b8e6ad5f7fcc8f2895b92ab542f84ccfaf75bd61b48f WatchSource:0}: Error finding container 5229e619f8389d23be57b8e6ad5f7fcc8f2895b92ab542f84ccfaf75bd61b48f: Status 404 returned error can't find the container with id 5229e619f8389d23be57b8e6ad5f7fcc8f2895b92ab542f84ccfaf75bd61b48f
Apr 17 17:24:01.483113 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:24:01.483093 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-42tv5"
Apr 17 17:24:01.488656 ip-10-0-136-202 kubenswrapper[2570]: W0417 17:24:01.488632 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0a83ec74_42da_427f_be72_02e777b9626c.slice/crio-4bbabaace168371056edc147db40a152610f452e61be3672d210953a3b82fa65 WatchSource:0}: Error finding container 4bbabaace168371056edc147db40a152610f452e61be3672d210953a3b82fa65: Status 404 returned error can't find the container with id 4bbabaace168371056edc147db40a152610f452e61be3672d210953a3b82fa65
Apr 17 17:24:01.492526 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:24:01.492505 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-n6fpq"
Apr 17 17:24:01.497493 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:24:01.497476 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-ks8m2"
Apr 17 17:24:01.501438 ip-10-0-136-202 kubenswrapper[2570]: W0417 17:24:01.501417 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podeb68dc81_dce9_4f3e_bf24_94eff6bc04bf.slice/crio-23aaad017227958cb76ecb8e29976cb0d321a7d8c58405a96fef7dc0f5102626 WatchSource:0}: Error finding container 23aaad017227958cb76ecb8e29976cb0d321a7d8c58405a96fef7dc0f5102626: Status 404 returned error can't find the container with id 23aaad017227958cb76ecb8e29976cb0d321a7d8c58405a96fef7dc0f5102626
Apr 17 17:24:01.502727 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:24:01.502707 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-b6njn"
Apr 17 17:24:01.504429 ip-10-0-136-202 kubenswrapper[2570]: W0417 17:24:01.504404 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8947a5af_e839_4e78_8aef_37f0885ea400.slice/crio-c20c64db9d9a6a2e9b97c2cd3112361853792469bdfee5f29be21acc844054f4 WatchSource:0}: Error finding container c20c64db9d9a6a2e9b97c2cd3112361853792469bdfee5f29be21acc844054f4: Status 404 returned error can't find the container with id c20c64db9d9a6a2e9b97c2cd3112361853792469bdfee5f29be21acc844054f4
Apr 17 17:24:01.509977 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:24:01.509956 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-gbft6"
Apr 17 17:24:01.510300 ip-10-0-136-202 kubenswrapper[2570]: W0417 17:24:01.510280 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode75ff2ec_f311_467f_a5b6_86322293f3ed.slice/crio-0f0b4032c3e762fa026f4595f3b1d229d03929b108cd5e6e0639599e2db1cb6e WatchSource:0}: Error finding container 0f0b4032c3e762fa026f4595f3b1d229d03929b108cd5e6e0639599e2db1cb6e: Status 404 returned error can't find the container with id 0f0b4032c3e762fa026f4595f3b1d229d03929b108cd5e6e0639599e2db1cb6e
Apr 17 17:24:01.514739 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:24:01.514721 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-hjzlb"
Apr 17 17:24:01.516387 ip-10-0-136-202 kubenswrapper[2570]: W0417 17:24:01.516362 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7683382f_975a_43b6_9d75_7d14283c2328.slice/crio-8194ea00b9343dc8e688ef9099fdb023e133fd696be72e66c06c1ce173e1aa3b WatchSource:0}: Error finding container 8194ea00b9343dc8e688ef9099fdb023e133fd696be72e66c06c1ce173e1aa3b: Status 404 returned error can't find the container with id 8194ea00b9343dc8e688ef9099fdb023e133fd696be72e66c06c1ce173e1aa3b
Apr 17 17:24:01.518497 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:24:01.518478 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-426x5"
Apr 17 17:24:01.523981 ip-10-0-136-202 kubenswrapper[2570]: W0417 17:24:01.523959 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod61a26f4f_99b7_4f63_a4a2_b6fef61dff87.slice/crio-804d38080e42e68f5b03994beb74a93ee0b6f8be81a70e9e3aecd5864245e938 WatchSource:0}: Error finding container 804d38080e42e68f5b03994beb74a93ee0b6f8be81a70e9e3aecd5864245e938: Status 404 returned error can't find the container with id 804d38080e42e68f5b03994beb74a93ee0b6f8be81a70e9e3aecd5864245e938
Apr 17 17:24:01.526903 ip-10-0-136-202 kubenswrapper[2570]: W0417 17:24:01.526882 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1f813146_daee_4a79_a436_af839213c97f.slice/crio-b5a5f706014c2df9a0b3ea34cdcca0a7173a5d2a2cb0c4e4c583a4e4210dae5a WatchSource:0}: Error finding container b5a5f706014c2df9a0b3ea34cdcca0a7173a5d2a2cb0c4e4c583a4e4210dae5a: Status 404 returned error can't find the container with id b5a5f706014c2df9a0b3ea34cdcca0a7173a5d2a2cb0c4e4c583a4e4210dae5a
Apr 17 17:24:01.532131 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:24:01.531598 2570 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 17 17:24:01.737980 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:24:01.737909 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/82dffe03-2b0c-4ac4-bb02-5a8430704805-metrics-certs\") pod \"network-metrics-daemon-zcn2s\" (UID: \"82dffe03-2b0c-4ac4-bb02-5a8430704805\") " pod="openshift-multus/network-metrics-daemon-zcn2s"
Apr 17 17:24:01.738115 ip-10-0-136-202 kubenswrapper[2570]: E0417 17:24:01.738074 2570 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 17 17:24:01.738192 ip-10-0-136-202 kubenswrapper[2570]: E0417 17:24:01.738138 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/82dffe03-2b0c-4ac4-bb02-5a8430704805-metrics-certs podName:82dffe03-2b0c-4ac4-bb02-5a8430704805 nodeName:}" failed. No retries permitted until 2026-04-17 17:24:02.738118352 +0000 UTC m=+3.117462146 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/82dffe03-2b0c-4ac4-bb02-5a8430704805-metrics-certs") pod "network-metrics-daemon-zcn2s" (UID: "82dffe03-2b0c-4ac4-bb02-5a8430704805") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 17 17:24:01.838589 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:24:01.838561 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zjf97\" (UniqueName: \"kubernetes.io/projected/3b7788b5-7a25-42a0-8536-8a543283bb1e-kube-api-access-zjf97\") pod \"network-check-target-xbzwj\" (UID: \"3b7788b5-7a25-42a0-8536-8a543283bb1e\") " pod="openshift-network-diagnostics/network-check-target-xbzwj"
Apr 17 17:24:01.838769 ip-10-0-136-202 kubenswrapper[2570]: E0417 17:24:01.838746 2570 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 17 17:24:01.838840 ip-10-0-136-202 kubenswrapper[2570]: E0417 17:24:01.838779 2570 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 17 17:24:01.838840 ip-10-0-136-202 kubenswrapper[2570]: E0417 17:24:01.838794 2570 projected.go:194] Error preparing data for projected volume kube-api-access-zjf97 for pod openshift-network-diagnostics/network-check-target-xbzwj: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 17 17:24:01.838951 ip-10-0-136-202 kubenswrapper[2570]: E0417 17:24:01.838868 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b7788b5-7a25-42a0-8536-8a543283bb1e-kube-api-access-zjf97 podName:3b7788b5-7a25-42a0-8536-8a543283bb1e nodeName:}" failed. No retries permitted until 2026-04-17 17:24:02.838847844 +0000 UTC m=+3.218191648 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-zjf97" (UniqueName: "kubernetes.io/projected/3b7788b5-7a25-42a0-8536-8a543283bb1e-kube-api-access-zjf97") pod "network-check-target-xbzwj" (UID: "3b7788b5-7a25-42a0-8536-8a543283bb1e") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 17 17:24:01.969097 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:24:01.968809 2570 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 17 17:24:02.166725 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:24:02.166644 2570 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-16 17:19:01 +0000 UTC" deadline="2027-11-15 19:02:50.618647592 +0000 UTC"
Apr 17 17:24:02.166725 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:24:02.166673 2570 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="13849h38m48.451977965s"
Apr 17 17:24:02.251950 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:24:02.251500 2570 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 17 17:24:02.279596 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:24:02.279560 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-n6fpq" event={"ID":"eb68dc81-dce9-4f3e-bf24-94eff6bc04bf","Type":"ContainerStarted","Data":"23aaad017227958cb76ecb8e29976cb0d321a7d8c58405a96fef7dc0f5102626"}
Apr 17 17:24:02.293071 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:24:02.293040 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-42tv5" event={"ID":"0a83ec74-42da-427f-be72-02e777b9626c","Type":"ContainerStarted","Data":"4bbabaace168371056edc147db40a152610f452e61be3672d210953a3b82fa65"}
Apr 17 17:24:02.304121 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:24:02.304094 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-hjzlb" event={"ID":"61a26f4f-99b7-4f63-a4a2-b6fef61dff87","Type":"ContainerStarted","Data":"804d38080e42e68f5b03994beb74a93ee0b6f8be81a70e9e3aecd5864245e938"}
Apr 17 17:24:02.310716 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:24:02.310691 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-b6njn" event={"ID":"e75ff2ec-f311-467f-a5b6-86322293f3ed","Type":"ContainerStarted","Data":"0f0b4032c3e762fa026f4595f3b1d229d03929b108cd5e6e0639599e2db1cb6e"}
Apr 17 17:24:02.335755 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:24:02.335732 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-xtx74" event={"ID":"9f3a808b-95e3-410b-bcf1-257bf1254f04","Type":"ContainerStarted","Data":"5229e619f8389d23be57b8e6ad5f7fcc8f2895b92ab542f84ccfaf75bd61b48f"}
Apr 17 17:24:02.355712 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:24:02.355656 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-j4mx4" event={"ID":"02eb784a-744c-4eac-bec8-46a3df516ca9","Type":"ContainerStarted","Data":"516bb35b1f12701aae12093525753e648d453ed63c87a2232bb6d272b0a43b72"}
Apr 17 17:24:02.363249 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:24:02.363221 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-426x5" event={"ID":"1f813146-daee-4a79-a436-af839213c97f","Type":"ContainerStarted","Data":"b5a5f706014c2df9a0b3ea34cdcca0a7173a5d2a2cb0c4e4c583a4e4210dae5a"}
Apr 17 17:24:02.367134 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:24:02.367109 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-gbft6" event={"ID":"7683382f-975a-43b6-9d75-7d14283c2328","Type":"ContainerStarted","Data":"8194ea00b9343dc8e688ef9099fdb023e133fd696be72e66c06c1ce173e1aa3b"}
Apr 17 17:24:02.382830 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:24:02.382807 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-ks8m2" event={"ID":"8947a5af-e839-4e78-8aef-37f0885ea400","Type":"ContainerStarted","Data":"c20c64db9d9a6a2e9b97c2cd3112361853792469bdfee5f29be21acc844054f4"}
Apr 17 17:24:02.745567 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:24:02.745534 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/82dffe03-2b0c-4ac4-bb02-5a8430704805-metrics-certs\") pod \"network-metrics-daemon-zcn2s\" (UID: \"82dffe03-2b0c-4ac4-bb02-5a8430704805\") " pod="openshift-multus/network-metrics-daemon-zcn2s"
Apr 17 17:24:02.745730 ip-10-0-136-202 kubenswrapper[2570]: E0417 17:24:02.745665 2570 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 17 17:24:02.745730 ip-10-0-136-202 kubenswrapper[2570]: E0417 17:24:02.745722 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/82dffe03-2b0c-4ac4-bb02-5a8430704805-metrics-certs podName:82dffe03-2b0c-4ac4-bb02-5a8430704805 nodeName:}" failed. No retries permitted until 2026-04-17 17:24:04.745704504 +0000 UTC m=+5.125048314 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/82dffe03-2b0c-4ac4-bb02-5a8430704805-metrics-certs") pod "network-metrics-daemon-zcn2s" (UID: "82dffe03-2b0c-4ac4-bb02-5a8430704805") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 17 17:24:02.847828 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:24:02.846529 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zjf97\" (UniqueName: \"kubernetes.io/projected/3b7788b5-7a25-42a0-8536-8a543283bb1e-kube-api-access-zjf97\") pod \"network-check-target-xbzwj\" (UID: \"3b7788b5-7a25-42a0-8536-8a543283bb1e\") " pod="openshift-network-diagnostics/network-check-target-xbzwj"
Apr 17 17:24:02.847828 ip-10-0-136-202 kubenswrapper[2570]: E0417 17:24:02.846791 2570 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 17 17:24:02.847828 ip-10-0-136-202 kubenswrapper[2570]: E0417 17:24:02.846817 2570 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 17 17:24:02.847828 ip-10-0-136-202 kubenswrapper[2570]: E0417 17:24:02.846831 2570 projected.go:194] Error preparing data for projected volume kube-api-access-zjf97 for pod openshift-network-diagnostics/network-check-target-xbzwj: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 17 17:24:02.847828 ip-10-0-136-202 kubenswrapper[2570]: E0417 17:24:02.846915 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b7788b5-7a25-42a0-8536-8a543283bb1e-kube-api-access-zjf97 podName:3b7788b5-7a25-42a0-8536-8a543283bb1e nodeName:}" failed. No retries permitted until 2026-04-17 17:24:04.846870911 +0000 UTC m=+5.226214703 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-zjf97" (UniqueName: "kubernetes.io/projected/3b7788b5-7a25-42a0-8536-8a543283bb1e-kube-api-access-zjf97") pod "network-check-target-xbzwj" (UID: "3b7788b5-7a25-42a0-8536-8a543283bb1e") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 17 17:24:03.167436 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:24:03.167396 2570 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-16 17:19:01 +0000 UTC" deadline="2028-01-06 14:26:56.3899611 +0000 UTC"
Apr 17 17:24:03.167436 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:24:03.167431 2570 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="15093h2m53.222533523s"
Apr 17 17:24:03.183327 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:24:03.183305 2570 reflector.go:430] "Caches populated" type="*v1.Node" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 17 17:24:03.249333 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:24:03.249310 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xbzwj"
Apr 17 17:24:03.249528 ip-10-0-136-202 kubenswrapper[2570]: E0417 17:24:03.249503 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xbzwj" podUID="3b7788b5-7a25-42a0-8536-8a543283bb1e"
Apr 17 17:24:03.250014 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:24:03.249993 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-zcn2s"
Apr 17 17:24:03.250110 ip-10-0-136-202 kubenswrapper[2570]: E0417 17:24:03.250092 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-zcn2s" podUID="82dffe03-2b0c-4ac4-bb02-5a8430704805"
Apr 17 17:24:04.759927 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:24:04.759891 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/82dffe03-2b0c-4ac4-bb02-5a8430704805-metrics-certs\") pod \"network-metrics-daemon-zcn2s\" (UID: \"82dffe03-2b0c-4ac4-bb02-5a8430704805\") " pod="openshift-multus/network-metrics-daemon-zcn2s"
Apr 17 17:24:04.760350 ip-10-0-136-202 kubenswrapper[2570]: E0417 17:24:04.760077 2570 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 17 17:24:04.760350 ip-10-0-136-202 kubenswrapper[2570]: E0417 17:24:04.760163 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/82dffe03-2b0c-4ac4-bb02-5a8430704805-metrics-certs podName:82dffe03-2b0c-4ac4-bb02-5a8430704805 nodeName:}" failed. No retries permitted until 2026-04-17 17:24:08.760124933 +0000 UTC m=+9.139468746 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/82dffe03-2b0c-4ac4-bb02-5a8430704805-metrics-certs") pod "network-metrics-daemon-zcn2s" (UID: "82dffe03-2b0c-4ac4-bb02-5a8430704805") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 17 17:24:04.860983 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:24:04.860950 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zjf97\" (UniqueName: \"kubernetes.io/projected/3b7788b5-7a25-42a0-8536-8a543283bb1e-kube-api-access-zjf97\") pod \"network-check-target-xbzwj\" (UID: \"3b7788b5-7a25-42a0-8536-8a543283bb1e\") " pod="openshift-network-diagnostics/network-check-target-xbzwj"
Apr 17 17:24:04.861172 ip-10-0-136-202 kubenswrapper[2570]: E0417 17:24:04.861098 2570 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 17 17:24:04.861172 ip-10-0-136-202 kubenswrapper[2570]: E0417 17:24:04.861117 2570 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 17 17:24:04.861172 ip-10-0-136-202 kubenswrapper[2570]: E0417 17:24:04.861129 2570 projected.go:194] Error preparing data for projected volume kube-api-access-zjf97 for pod openshift-network-diagnostics/network-check-target-xbzwj: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 17 17:24:04.861330 ip-10-0-136-202 kubenswrapper[2570]: E0417 17:24:04.861196 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b7788b5-7a25-42a0-8536-8a543283bb1e-kube-api-access-zjf97 podName:3b7788b5-7a25-42a0-8536-8a543283bb1e nodeName:}" failed.
No retries permitted until 2026-04-17 17:24:08.861178716 +0000 UTC m=+9.240522523 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-zjf97" (UniqueName: "kubernetes.io/projected/3b7788b5-7a25-42a0-8536-8a543283bb1e-kube-api-access-zjf97") pod "network-check-target-xbzwj" (UID: "3b7788b5-7a25-42a0-8536-8a543283bb1e") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 17:24:05.249876 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:24:05.249393 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-zcn2s" Apr 17 17:24:05.249876 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:24:05.249421 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xbzwj" Apr 17 17:24:05.249876 ip-10-0-136-202 kubenswrapper[2570]: E0417 17:24:05.249534 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-zcn2s" podUID="82dffe03-2b0c-4ac4-bb02-5a8430704805" Apr 17 17:24:05.249876 ip-10-0-136-202 kubenswrapper[2570]: E0417 17:24:05.249635 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xbzwj" podUID="3b7788b5-7a25-42a0-8536-8a543283bb1e" Apr 17 17:24:07.249906 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:24:07.249864 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xbzwj" Apr 17 17:24:07.250365 ip-10-0-136-202 kubenswrapper[2570]: E0417 17:24:07.249986 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xbzwj" podUID="3b7788b5-7a25-42a0-8536-8a543283bb1e" Apr 17 17:24:07.250441 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:24:07.250365 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-zcn2s" Apr 17 17:24:07.250520 ip-10-0-136-202 kubenswrapper[2570]: E0417 17:24:07.250487 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-zcn2s" podUID="82dffe03-2b0c-4ac4-bb02-5a8430704805" Apr 17 17:24:08.793290 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:24:08.793225 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/82dffe03-2b0c-4ac4-bb02-5a8430704805-metrics-certs\") pod \"network-metrics-daemon-zcn2s\" (UID: \"82dffe03-2b0c-4ac4-bb02-5a8430704805\") " pod="openshift-multus/network-metrics-daemon-zcn2s" Apr 17 17:24:08.793741 ip-10-0-136-202 kubenswrapper[2570]: E0417 17:24:08.793361 2570 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 17:24:08.793741 ip-10-0-136-202 kubenswrapper[2570]: E0417 17:24:08.793440 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/82dffe03-2b0c-4ac4-bb02-5a8430704805-metrics-certs podName:82dffe03-2b0c-4ac4-bb02-5a8430704805 nodeName:}" failed. No retries permitted until 2026-04-17 17:24:16.793416668 +0000 UTC m=+17.172760477 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/82dffe03-2b0c-4ac4-bb02-5a8430704805-metrics-certs") pod "network-metrics-daemon-zcn2s" (UID: "82dffe03-2b0c-4ac4-bb02-5a8430704805") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 17:24:08.894010 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:24:08.893971 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zjf97\" (UniqueName: \"kubernetes.io/projected/3b7788b5-7a25-42a0-8536-8a543283bb1e-kube-api-access-zjf97\") pod \"network-check-target-xbzwj\" (UID: \"3b7788b5-7a25-42a0-8536-8a543283bb1e\") " pod="openshift-network-diagnostics/network-check-target-xbzwj" Apr 17 17:24:08.894188 ip-10-0-136-202 kubenswrapper[2570]: E0417 17:24:08.894171 2570 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 17 17:24:08.894257 ip-10-0-136-202 kubenswrapper[2570]: E0417 17:24:08.894197 2570 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 17 17:24:08.894257 ip-10-0-136-202 kubenswrapper[2570]: E0417 17:24:08.894211 2570 projected.go:194] Error preparing data for projected volume kube-api-access-zjf97 for pod openshift-network-diagnostics/network-check-target-xbzwj: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 17:24:08.894359 ip-10-0-136-202 kubenswrapper[2570]: E0417 17:24:08.894273 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b7788b5-7a25-42a0-8536-8a543283bb1e-kube-api-access-zjf97 podName:3b7788b5-7a25-42a0-8536-8a543283bb1e nodeName:}" failed. 
No retries permitted until 2026-04-17 17:24:16.894253345 +0000 UTC m=+17.273597158 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-zjf97" (UniqueName: "kubernetes.io/projected/3b7788b5-7a25-42a0-8536-8a543283bb1e-kube-api-access-zjf97") pod "network-check-target-xbzwj" (UID: "3b7788b5-7a25-42a0-8536-8a543283bb1e") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 17:24:09.249694 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:24:09.249617 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xbzwj" Apr 17 17:24:09.249694 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:24:09.249638 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-zcn2s" Apr 17 17:24:09.249898 ip-10-0-136-202 kubenswrapper[2570]: E0417 17:24:09.249733 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xbzwj" podUID="3b7788b5-7a25-42a0-8536-8a543283bb1e" Apr 17 17:24:09.249898 ip-10-0-136-202 kubenswrapper[2570]: E0417 17:24:09.249871 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-zcn2s" podUID="82dffe03-2b0c-4ac4-bb02-5a8430704805" Apr 17 17:24:11.249523 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:24:11.249499 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-zcn2s" Apr 17 17:24:11.249900 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:24:11.249499 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xbzwj" Apr 17 17:24:11.249900 ip-10-0-136-202 kubenswrapper[2570]: E0417 17:24:11.249613 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-zcn2s" podUID="82dffe03-2b0c-4ac4-bb02-5a8430704805" Apr 17 17:24:11.249900 ip-10-0-136-202 kubenswrapper[2570]: E0417 17:24:11.249689 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xbzwj" podUID="3b7788b5-7a25-42a0-8536-8a543283bb1e" Apr 17 17:24:13.249236 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:24:13.249207 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-zcn2s" Apr 17 17:24:13.249732 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:24:13.249206 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xbzwj" Apr 17 17:24:13.249732 ip-10-0-136-202 kubenswrapper[2570]: E0417 17:24:13.249318 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-zcn2s" podUID="82dffe03-2b0c-4ac4-bb02-5a8430704805" Apr 17 17:24:13.249732 ip-10-0-136-202 kubenswrapper[2570]: E0417 17:24:13.249407 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xbzwj" podUID="3b7788b5-7a25-42a0-8536-8a543283bb1e" Apr 17 17:24:15.249078 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:24:15.249040 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-zcn2s" Apr 17 17:24:15.249568 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:24:15.249040 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xbzwj" Apr 17 17:24:15.249568 ip-10-0-136-202 kubenswrapper[2570]: E0417 17:24:15.249188 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-zcn2s" podUID="82dffe03-2b0c-4ac4-bb02-5a8430704805" Apr 17 17:24:15.249568 ip-10-0-136-202 kubenswrapper[2570]: E0417 17:24:15.249264 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xbzwj" podUID="3b7788b5-7a25-42a0-8536-8a543283bb1e" Apr 17 17:24:16.855993 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:24:16.855960 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/82dffe03-2b0c-4ac4-bb02-5a8430704805-metrics-certs\") pod \"network-metrics-daemon-zcn2s\" (UID: \"82dffe03-2b0c-4ac4-bb02-5a8430704805\") " pod="openshift-multus/network-metrics-daemon-zcn2s" Apr 17 17:24:16.856462 ip-10-0-136-202 kubenswrapper[2570]: E0417 17:24:16.856121 2570 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 17:24:16.856462 ip-10-0-136-202 kubenswrapper[2570]: E0417 17:24:16.856207 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/82dffe03-2b0c-4ac4-bb02-5a8430704805-metrics-certs podName:82dffe03-2b0c-4ac4-bb02-5a8430704805 nodeName:}" failed. No retries permitted until 2026-04-17 17:24:32.856186494 +0000 UTC m=+33.235530302 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/82dffe03-2b0c-4ac4-bb02-5a8430704805-metrics-certs") pod "network-metrics-daemon-zcn2s" (UID: "82dffe03-2b0c-4ac4-bb02-5a8430704805") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 17:24:16.957116 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:24:16.957081 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zjf97\" (UniqueName: \"kubernetes.io/projected/3b7788b5-7a25-42a0-8536-8a543283bb1e-kube-api-access-zjf97\") pod \"network-check-target-xbzwj\" (UID: \"3b7788b5-7a25-42a0-8536-8a543283bb1e\") " pod="openshift-network-diagnostics/network-check-target-xbzwj" Apr 17 17:24:16.957289 ip-10-0-136-202 kubenswrapper[2570]: E0417 17:24:16.957232 2570 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 17 17:24:16.957289 ip-10-0-136-202 kubenswrapper[2570]: E0417 17:24:16.957251 2570 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 17 17:24:16.957289 ip-10-0-136-202 kubenswrapper[2570]: E0417 17:24:16.957264 2570 projected.go:194] Error preparing data for projected volume kube-api-access-zjf97 for pod openshift-network-diagnostics/network-check-target-xbzwj: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 17:24:16.957442 ip-10-0-136-202 kubenswrapper[2570]: E0417 17:24:16.957316 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b7788b5-7a25-42a0-8536-8a543283bb1e-kube-api-access-zjf97 podName:3b7788b5-7a25-42a0-8536-8a543283bb1e nodeName:}" failed. 
No retries permitted until 2026-04-17 17:24:32.957301872 +0000 UTC m=+33.336645663 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-zjf97" (UniqueName: "kubernetes.io/projected/3b7788b5-7a25-42a0-8536-8a543283bb1e-kube-api-access-zjf97") pod "network-check-target-xbzwj" (UID: "3b7788b5-7a25-42a0-8536-8a543283bb1e") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 17:24:17.249781 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:24:17.249709 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xbzwj" Apr 17 17:24:17.249938 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:24:17.249713 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-zcn2s" Apr 17 17:24:17.249938 ip-10-0-136-202 kubenswrapper[2570]: E0417 17:24:17.249814 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xbzwj" podUID="3b7788b5-7a25-42a0-8536-8a543283bb1e" Apr 17 17:24:17.249938 ip-10-0-136-202 kubenswrapper[2570]: E0417 17:24:17.249889 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-zcn2s" podUID="82dffe03-2b0c-4ac4-bb02-5a8430704805" Apr 17 17:24:19.249542 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:24:19.249525 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xbzwj" Apr 17 17:24:19.249821 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:24:19.249546 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-zcn2s" Apr 17 17:24:19.249821 ip-10-0-136-202 kubenswrapper[2570]: E0417 17:24:19.249607 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xbzwj" podUID="3b7788b5-7a25-42a0-8536-8a543283bb1e" Apr 17 17:24:19.249821 ip-10-0-136-202 kubenswrapper[2570]: E0417 17:24:19.249745 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-zcn2s" podUID="82dffe03-2b0c-4ac4-bb02-5a8430704805" Apr 17 17:24:20.420697 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:24:20.420520 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-n6fpq" event={"ID":"eb68dc81-dce9-4f3e-bf24-94eff6bc04bf","Type":"ContainerStarted","Data":"3542381dd22a5740766320b3cf7086835d7aa0981edf8080fe1b583c97f027b2"} Apr 17 17:24:20.423045 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:24:20.423024 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-42tv5" event={"ID":"0a83ec74-42da-427f-be72-02e777b9626c","Type":"ContainerStarted","Data":"9705071ac33e25fa5ff4e0eebbf36bb5d06b3f151f59958a2c2187c253814e16"} Apr 17 17:24:20.423117 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:24:20.423055 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-42tv5" event={"ID":"0a83ec74-42da-427f-be72-02e777b9626c","Type":"ContainerStarted","Data":"d98e519b6ee66ef7145913e4484f9c53daaae830c6ab6c8d12b9e4e6e1e71892"} Apr 17 17:24:20.423117 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:24:20.423071 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-42tv5" event={"ID":"0a83ec74-42da-427f-be72-02e777b9626c","Type":"ContainerStarted","Data":"bba7369f881d36d3d2ff7a6d323a6c46e7839dc51fbe3dc2819e5772d2749b2b"} Apr 17 17:24:20.423117 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:24:20.423083 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-42tv5" event={"ID":"0a83ec74-42da-427f-be72-02e777b9626c","Type":"ContainerStarted","Data":"78f3fc0db00a243e2a5c88a66f5554b872daf0e6529cebd23fbcd05d067f1200"} Apr 17 17:24:20.423117 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:24:20.423098 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-42tv5" 
event={"ID":"0a83ec74-42da-427f-be72-02e777b9626c","Type":"ContainerStarted","Data":"be48b1b93d4d1def523a716d3f338586feca0ce362093007510e7952a0bed699"} Apr 17 17:24:20.423117 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:24:20.423110 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-42tv5" event={"ID":"0a83ec74-42da-427f-be72-02e777b9626c","Type":"ContainerStarted","Data":"b43410181823156bd1ac5ab0ec7d7de72e775d792092ab800e44fe2589d460d3"} Apr 17 17:24:20.424232 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:24:20.424210 2570 generic.go:358] "Generic (PLEG): container finished" podID="5edfcc530d2d3e0bfbb4e13a5b40905e" containerID="4d773bdabad9e6f5922842d5d3a63b92085e05b1d92772efbab2720e829db918" exitCode=0 Apr 17 17:24:20.424315 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:24:20.424283 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-136-202.ec2.internal" event={"ID":"5edfcc530d2d3e0bfbb4e13a5b40905e","Type":"ContainerDied","Data":"4d773bdabad9e6f5922842d5d3a63b92085e05b1d92772efbab2720e829db918"} Apr 17 17:24:20.425567 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:24:20.425534 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-136-202.ec2.internal" event={"ID":"697494af72639fc0eb382c71525a5808","Type":"ContainerStarted","Data":"5e3674d88107aee081f9c0b1fc3f000d9c68a2332309ee35bd7804097e7b4753"} Apr 17 17:24:20.426930 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:24:20.426905 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-hjzlb" event={"ID":"61a26f4f-99b7-4f63-a4a2-b6fef61dff87","Type":"ContainerStarted","Data":"4ed50cba5ce16d0328a82d3e1c23bc415099e5c8e7c76079e68ff47170468f90"} Apr 17 17:24:20.428276 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:24:20.428255 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-dns/node-resolver-b6njn" event={"ID":"e75ff2ec-f311-467f-a5b6-86322293f3ed","Type":"ContainerStarted","Data":"c722b92d794c6c007063cb4f958d047a0e02d13133b45df166b4f6454763f86f"} Apr 17 17:24:20.429486 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:24:20.429464 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-xtx74" event={"ID":"9f3a808b-95e3-410b-bcf1-257bf1254f04","Type":"ContainerStarted","Data":"eb4038c13ccbc00dd5dbea9cf7a85095277a2c8cf4cf2c30e0c0733ea02ae674"} Apr 17 17:24:20.430628 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:24:20.430607 2570 generic.go:358] "Generic (PLEG): container finished" podID="02eb784a-744c-4eac-bec8-46a3df516ca9" containerID="5e4e2c617d36672475f5a7aae50b48281f059d36d477b92df69a8e6c6a5c6301" exitCode=0 Apr 17 17:24:20.430703 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:24:20.430670 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-j4mx4" event={"ID":"02eb784a-744c-4eac-bec8-46a3df516ca9","Type":"ContainerDied","Data":"5e4e2c617d36672475f5a7aae50b48281f059d36d477b92df69a8e6c6a5c6301"} Apr 17 17:24:20.431970 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:24:20.431944 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-426x5" event={"ID":"1f813146-daee-4a79-a436-af839213c97f","Type":"ContainerStarted","Data":"46f8c98097101557e7929aa1d1b0e3ff7a5b63a8bde35788ebe7dc3bb91b0560"} Apr 17 17:24:20.433404 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:24:20.433386 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-ks8m2" event={"ID":"8947a5af-e839-4e78-8aef-37f0885ea400","Type":"ContainerStarted","Data":"f45af21aed48e1af8f41509466a43826d3339711017339c4b5c0a9f4e92a10f6"} Apr 17 17:24:20.438428 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:24:20.438381 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="kube-system/konnectivity-agent-n6fpq" podStartSLOduration=2.749412086 podStartE2EDuration="20.438365762s" podCreationTimestamp="2026-04-17 17:24:00 +0000 UTC" firstStartedPulling="2026-04-17 17:24:01.503371259 +0000 UTC m=+1.882715056" lastFinishedPulling="2026-04-17 17:24:19.192324928 +0000 UTC m=+19.571668732" observedRunningTime="2026-04-17 17:24:20.437545812 +0000 UTC m=+20.816889626" watchObservedRunningTime="2026-04-17 17:24:20.438365762 +0000 UTC m=+20.817709590"
Apr 17 17:24:20.457481 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:24:20.457275 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-node-tuning-operator/tuned-426x5" podStartSLOduration=2.7911125 podStartE2EDuration="20.457262242s" podCreationTimestamp="2026-04-17 17:24:00 +0000 UTC" firstStartedPulling="2026-04-17 17:24:01.528864163 +0000 UTC m=+1.908207959" lastFinishedPulling="2026-04-17 17:24:19.1950139 +0000 UTC m=+19.574357701" observedRunningTime="2026-04-17 17:24:20.456878522 +0000 UTC m=+20.836222335" watchObservedRunningTime="2026-04-17 17:24:20.457262242 +0000 UTC m=+20.836606055"
Apr 17 17:24:20.494564 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:24:20.494491 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-proxy-ip-10-0-136-202.ec2.internal" podStartSLOduration=19.494476566 podStartE2EDuration="19.494476566s" podCreationTimestamp="2026-04-17 17:24:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 17:24:20.470956226 +0000 UTC m=+20.850300040" watchObservedRunningTime="2026-04-17 17:24:20.494476566 +0000 UTC m=+20.873820380"
Apr 17 17:24:20.511636 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:24:20.511602 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-ks8m2" podStartSLOduration=2.844247538 podStartE2EDuration="20.511591653s" podCreationTimestamp="2026-04-17 17:24:00 +0000 UTC" firstStartedPulling="2026-04-17 17:24:01.506584845 +0000 UTC m=+1.885928639" lastFinishedPulling="2026-04-17 17:24:19.173928955 +0000 UTC m=+19.553272754" observedRunningTime="2026-04-17 17:24:20.51137982 +0000 UTC m=+20.890723632" watchObservedRunningTime="2026-04-17 17:24:20.511591653 +0000 UTC m=+20.890935495"
Apr 17 17:24:20.526646 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:24:20.526613 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-b6njn" podStartSLOduration=2.847206594 podStartE2EDuration="20.526603212s" podCreationTimestamp="2026-04-17 17:24:00 +0000 UTC" firstStartedPulling="2026-04-17 17:24:01.513353644 +0000 UTC m=+1.892697439" lastFinishedPulling="2026-04-17 17:24:19.192750259 +0000 UTC m=+19.572094057" observedRunningTime="2026-04-17 17:24:20.526316486 +0000 UTC m=+20.905660298" watchObservedRunningTime="2026-04-17 17:24:20.526603212 +0000 UTC m=+20.905947024"
Apr 17 17:24:20.558503 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:24:20.558470 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-xtx74" podStartSLOduration=2.822811621 podStartE2EDuration="20.558460483s" podCreationTimestamp="2026-04-17 17:24:00 +0000 UTC" firstStartedPulling="2026-04-17 17:24:01.470940792 +0000 UTC m=+1.850284588" lastFinishedPulling="2026-04-17 17:24:19.206589656 +0000 UTC m=+19.585933450" observedRunningTime="2026-04-17 17:24:20.558160649 +0000 UTC m=+20.937504455" watchObservedRunningTime="2026-04-17 17:24:20.558460483 +0000 UTC m=+20.937804296"
Apr 17 17:24:21.249559 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:24:21.249136 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-zcn2s"
Apr 17 17:24:21.249559 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:24:21.249133 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xbzwj"
Apr 17 17:24:21.249559 ip-10-0-136-202 kubenswrapper[2570]: E0417 17:24:21.249289 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-zcn2s" podUID="82dffe03-2b0c-4ac4-bb02-5a8430704805"
Apr 17 17:24:21.249559 ip-10-0-136-202 kubenswrapper[2570]: E0417 17:24:21.249358 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xbzwj" podUID="3b7788b5-7a25-42a0-8536-8a543283bb1e"
Apr 17 17:24:21.437499 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:24:21.437469 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-136-202.ec2.internal" event={"ID":"5edfcc530d2d3e0bfbb4e13a5b40905e","Type":"ContainerStarted","Data":"04363015350f573988a5f404788289955c4fa845d3caf6403565ece270c45a9c"}
Apr 17 17:24:21.442786 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:24:21.442761 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-gbft6" event={"ID":"7683382f-975a-43b6-9d75-7d14283c2328","Type":"ContainerStarted","Data":"66f34c46bb30b1eefc500083c7aef5870c015df0c46b40d2c9b34d158bf7a192"}
Apr 17 17:24:21.445422 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:24:21.445400 2570 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock"
Apr 17 17:24:21.454098 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:24:21.454061 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-136-202.ec2.internal" podStartSLOduration=20.45405012 podStartE2EDuration="20.45405012s" podCreationTimestamp="2026-04-17 17:24:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 17:24:21.453491642 +0000 UTC m=+21.832835455" watchObservedRunningTime="2026-04-17 17:24:21.45405012 +0000 UTC m=+21.833393932"
Apr 17 17:24:21.467789 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:24:21.467759 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-operator/iptables-alerter-gbft6" podStartSLOduration=3.79427527 podStartE2EDuration="21.467749714s" podCreationTimestamp="2026-04-17 17:24:00 +0000 UTC" firstStartedPulling="2026-04-17 17:24:01.520366567 +0000 UTC m=+1.899710373" lastFinishedPulling="2026-04-17 17:24:19.193841013 +0000 UTC m=+19.573184817" observedRunningTime="2026-04-17 17:24:21.467727488 +0000 UTC m=+21.847071301" watchObservedRunningTime="2026-04-17 17:24:21.467749714 +0000 UTC m=+21.847093549"
Apr 17 17:24:22.192555 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:24:22.192436 2570 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock","Timestamp":"2026-04-17T17:24:21.445419026Z","UUID":"273deb40-3ef6-4263-af26-48f86e789bf8","Handler":null,"Name":"","Endpoint":""}
Apr 17 17:24:22.194499 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:24:22.194477 2570 csi_plugin.go:106] kubernetes.io/csi: Trying to validate a new CSI Driver with name: ebs.csi.aws.com endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock versions: 1.0.0
Apr 17 17:24:22.194596 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:24:22.194510 2570 csi_plugin.go:119] kubernetes.io/csi: Register new plugin with name: ebs.csi.aws.com at endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock
Apr 17 17:24:22.446999 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:24:22.446924 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-42tv5" event={"ID":"0a83ec74-42da-427f-be72-02e777b9626c","Type":"ContainerStarted","Data":"17dceab5fcf90dc633660b5ca4703e66e41533375f5b38e99910ba8542b10c1a"}
Apr 17 17:24:22.448840 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:24:22.448795 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-hjzlb" event={"ID":"61a26f4f-99b7-4f63-a4a2-b6fef61dff87","Type":"ContainerStarted","Data":"026687b1d0bde67536bd0a47fe621fc05263a9b8fa69c276e6be45f7ac637c50"}
Apr 17 17:24:22.448840 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:24:22.448831 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-hjzlb" event={"ID":"61a26f4f-99b7-4f63-a4a2-b6fef61dff87","Type":"ContainerStarted","Data":"bc63d0b0676da8fb5d0e694f05a98fd1ef88000a3c8722947919282f71b27ab1"}
Apr 17 17:24:22.465353 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:24:22.465300 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-hjzlb" podStartSLOduration=1.830522433 podStartE2EDuration="22.465287881s" podCreationTimestamp="2026-04-17 17:24:00 +0000 UTC" firstStartedPulling="2026-04-17 17:24:01.525820106 +0000 UTC m=+1.905163902" lastFinishedPulling="2026-04-17 17:24:22.160585558 +0000 UTC m=+22.539929350" observedRunningTime="2026-04-17 17:24:22.465059971 +0000 UTC m=+22.844403787" watchObservedRunningTime="2026-04-17 17:24:22.465287881 +0000 UTC m=+22.844631695"
Apr 17 17:24:23.249751 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:24:23.249724 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xbzwj"
Apr 17 17:24:23.249929 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:24:23.249736 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-zcn2s"
Apr 17 17:24:23.249929 ip-10-0-136-202 kubenswrapper[2570]: E0417 17:24:23.249832 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xbzwj" podUID="3b7788b5-7a25-42a0-8536-8a543283bb1e"
Apr 17 17:24:23.250031 ip-10-0-136-202 kubenswrapper[2570]: E0417 17:24:23.249950 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-zcn2s" podUID="82dffe03-2b0c-4ac4-bb02-5a8430704805"
Apr 17 17:24:23.580568 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:24:23.580528 2570 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/konnectivity-agent-n6fpq"
Apr 17 17:24:23.581693 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:24:23.581508 2570 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kube-system/konnectivity-agent-n6fpq"
Apr 17 17:24:24.456615 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:24:24.456192 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-42tv5" event={"ID":"0a83ec74-42da-427f-be72-02e777b9626c","Type":"ContainerStarted","Data":"168acdefc43c17d7b148499e31c817968bdb9dd4d9d183ae94ab18730b10dd20"}
Apr 17 17:24:24.456615 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:24:24.456587 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kube-system/konnectivity-agent-n6fpq"
Apr 17 17:24:24.457217 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:24:24.457082 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kube-system/konnectivity-agent-n6fpq"
Apr 17 17:24:24.487089 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:24:24.487042 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-42tv5" podStartSLOduration=6.28981345 podStartE2EDuration="24.487024513s" podCreationTimestamp="2026-04-17 17:24:00 +0000 UTC" firstStartedPulling="2026-04-17 17:24:01.490126075 +0000 UTC m=+1.869469866" lastFinishedPulling="2026-04-17 17:24:19.687337134 +0000 UTC m=+20.066680929" observedRunningTime="2026-04-17 17:24:24.486009794 +0000 UTC m=+24.865353607" watchObservedRunningTime="2026-04-17 17:24:24.487024513 +0000 UTC m=+24.866368329"
Apr 17 17:24:25.249459 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:24:25.249433 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xbzwj"
Apr 17 17:24:25.249459 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:24:25.249447 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-zcn2s"
Apr 17 17:24:25.249821 ip-10-0-136-202 kubenswrapper[2570]: E0417 17:24:25.249519 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xbzwj" podUID="3b7788b5-7a25-42a0-8536-8a543283bb1e"
Apr 17 17:24:25.249821 ip-10-0-136-202 kubenswrapper[2570]: E0417 17:24:25.249636 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-zcn2s" podUID="82dffe03-2b0c-4ac4-bb02-5a8430704805"
Apr 17 17:24:25.459697 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:24:25.459669 2570 generic.go:358] "Generic (PLEG): container finished" podID="02eb784a-744c-4eac-bec8-46a3df516ca9" containerID="1edea36f65254f2548453fea3c040dac689f91894ab041a67c6a8da63c337ace" exitCode=0
Apr 17 17:24:25.459832 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:24:25.459751 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-j4mx4" event={"ID":"02eb784a-744c-4eac-bec8-46a3df516ca9","Type":"ContainerDied","Data":"1edea36f65254f2548453fea3c040dac689f91894ab041a67c6a8da63c337ace"}
Apr 17 17:24:25.460769 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:24:25.460480 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-42tv5"
Apr 17 17:24:25.460769 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:24:25.460507 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-42tv5"
Apr 17 17:24:25.460769 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:24:25.460520 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-42tv5"
Apr 17 17:24:25.474712 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:24:25.474692 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-42tv5"
Apr 17 17:24:25.474849 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:24:25.474836 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-42tv5"
Apr 17 17:24:26.308113 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:24:26.307958 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-xbzwj"]
Apr 17 17:24:26.308573 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:24:26.308232 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xbzwj"
Apr 17 17:24:26.308573 ip-10-0-136-202 kubenswrapper[2570]: E0417 17:24:26.308327 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xbzwj" podUID="3b7788b5-7a25-42a0-8536-8a543283bb1e"
Apr 17 17:24:26.308713 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:24:26.308694 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-zcn2s"]
Apr 17 17:24:26.308796 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:24:26.308786 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-zcn2s"
Apr 17 17:24:26.308876 ip-10-0-136-202 kubenswrapper[2570]: E0417 17:24:26.308861 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-zcn2s" podUID="82dffe03-2b0c-4ac4-bb02-5a8430704805"
Apr 17 17:24:26.463521 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:24:26.463464 2570 generic.go:358] "Generic (PLEG): container finished" podID="02eb784a-744c-4eac-bec8-46a3df516ca9" containerID="0767e1b6f6e2943632d29b4a55758c46df037f0bfbec886ad2788c6dbc9c5ddc" exitCode=0
Apr 17 17:24:26.463623 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:24:26.463554 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-j4mx4" event={"ID":"02eb784a-744c-4eac-bec8-46a3df516ca9","Type":"ContainerDied","Data":"0767e1b6f6e2943632d29b4a55758c46df037f0bfbec886ad2788c6dbc9c5ddc"}
Apr 17 17:24:27.467230 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:24:27.467173 2570 generic.go:358] "Generic (PLEG): container finished" podID="02eb784a-744c-4eac-bec8-46a3df516ca9" containerID="2f924a1b11171958a29bc54dff5adf42b13f3c3eca86d9dfc64d40370f2777f3" exitCode=0
Apr 17 17:24:27.467565 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:24:27.467252 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-j4mx4" event={"ID":"02eb784a-744c-4eac-bec8-46a3df516ca9","Type":"ContainerDied","Data":"2f924a1b11171958a29bc54dff5adf42b13f3c3eca86d9dfc64d40370f2777f3"}
Apr 17 17:24:28.249446 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:24:28.249409 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-zcn2s"
Apr 17 17:24:28.249446 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:24:28.249445 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xbzwj"
Apr 17 17:24:28.249652 ip-10-0-136-202 kubenswrapper[2570]: E0417 17:24:28.249539 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-zcn2s" podUID="82dffe03-2b0c-4ac4-bb02-5a8430704805"
Apr 17 17:24:28.249706 ip-10-0-136-202 kubenswrapper[2570]: E0417 17:24:28.249671 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xbzwj" podUID="3b7788b5-7a25-42a0-8536-8a543283bb1e"
Apr 17 17:24:30.250071 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:24:30.250027 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xbzwj"
Apr 17 17:24:30.250666 ip-10-0-136-202 kubenswrapper[2570]: E0417 17:24:30.250138 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xbzwj" podUID="3b7788b5-7a25-42a0-8536-8a543283bb1e"
Apr 17 17:24:30.250666 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:24:30.250210 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-zcn2s"
Apr 17 17:24:30.250666 ip-10-0-136-202 kubenswrapper[2570]: E0417 17:24:30.250312 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-zcn2s" podUID="82dffe03-2b0c-4ac4-bb02-5a8430704805"
Apr 17 17:24:32.249990 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:24:32.249783 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-zcn2s"
Apr 17 17:24:32.250411 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:24:32.249837 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xbzwj"
Apr 17 17:24:32.250411 ip-10-0-136-202 kubenswrapper[2570]: E0417 17:24:32.250107 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-zcn2s" podUID="82dffe03-2b0c-4ac4-bb02-5a8430704805"
Apr 17 17:24:32.250411 ip-10-0-136-202 kubenswrapper[2570]: E0417 17:24:32.250176 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xbzwj" podUID="3b7788b5-7a25-42a0-8536-8a543283bb1e"
Apr 17 17:24:32.386945 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:24:32.386876 2570 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-136-202.ec2.internal" event="NodeReady"
Apr 17 17:24:32.387092 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:24:32.387011 2570 kubelet_node_status.go:550] "Fast updating node status as it just became ready"
Apr 17 17:24:32.453012 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:24:32.452983 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-qbjvx"]
Apr 17 17:24:32.482262 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:24:32.482232 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-szcn9"]
Apr 17 17:24:32.482396 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:24:32.482378 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-qbjvx"
Apr 17 17:24:32.486989 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:24:32.486867 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-default-metrics-tls\""
Apr 17 17:24:32.486989 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:24:32.486920 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-qbw75\""
Apr 17 17:24:32.487160 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:24:32.486867 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"dns-default\""
Apr 17 17:24:32.499591 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:24:32.499571 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-qbjvx"]
Apr 17 17:24:32.499687 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:24:32.499598 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-szcn9"]
Apr 17 17:24:32.499687 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:24:32.499666 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-szcn9"
Apr 17 17:24:32.501955 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:24:32.501927 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"openshift-service-ca.crt\""
Apr 17 17:24:32.501955 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:24:32.501930 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"canary-serving-cert\""
Apr 17 17:24:32.502115 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:24:32.501997 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-hd29f\""
Apr 17 17:24:32.502406 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:24:32.502388 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"kube-root-ca.crt\""
Apr 17 17:24:32.577328 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:24:32.577295 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b6dm4\" (UniqueName: \"kubernetes.io/projected/e067e25c-97d9-490a-b6a9-e17550577b5e-kube-api-access-b6dm4\") pod \"ingress-canary-szcn9\" (UID: \"e067e25c-97d9-490a-b6a9-e17550577b5e\") " pod="openshift-ingress-canary/ingress-canary-szcn9"
Apr 17 17:24:32.577447 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:24:32.577342 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/5d8aa389-0914-474c-8a7a-11a05cb1ee33-config-volume\") pod \"dns-default-qbjvx\" (UID: \"5d8aa389-0914-474c-8a7a-11a05cb1ee33\") " pod="openshift-dns/dns-default-qbjvx"
Apr 17 17:24:32.577447 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:24:32.577372 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gcwzz\" (UniqueName: \"kubernetes.io/projected/5d8aa389-0914-474c-8a7a-11a05cb1ee33-kube-api-access-gcwzz\") pod \"dns-default-qbjvx\" (UID: \"5d8aa389-0914-474c-8a7a-11a05cb1ee33\") " pod="openshift-dns/dns-default-qbjvx"
Apr 17 17:24:32.577560 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:24:32.577464 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/5d8aa389-0914-474c-8a7a-11a05cb1ee33-metrics-tls\") pod \"dns-default-qbjvx\" (UID: \"5d8aa389-0914-474c-8a7a-11a05cb1ee33\") " pod="openshift-dns/dns-default-qbjvx"
Apr 17 17:24:32.577560 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:24:32.577509 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/5d8aa389-0914-474c-8a7a-11a05cb1ee33-tmp-dir\") pod \"dns-default-qbjvx\" (UID: \"5d8aa389-0914-474c-8a7a-11a05cb1ee33\") " pod="openshift-dns/dns-default-qbjvx"
Apr 17 17:24:32.577560 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:24:32.577541 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e067e25c-97d9-490a-b6a9-e17550577b5e-cert\") pod \"ingress-canary-szcn9\" (UID: \"e067e25c-97d9-490a-b6a9-e17550577b5e\") " pod="openshift-ingress-canary/ingress-canary-szcn9"
Apr 17 17:24:32.678794 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:24:32.678702 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/5d8aa389-0914-474c-8a7a-11a05cb1ee33-metrics-tls\") pod \"dns-default-qbjvx\" (UID: \"5d8aa389-0914-474c-8a7a-11a05cb1ee33\") " pod="openshift-dns/dns-default-qbjvx"
Apr 17 17:24:32.678794 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:24:32.678754 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/5d8aa389-0914-474c-8a7a-11a05cb1ee33-tmp-dir\") pod \"dns-default-qbjvx\" (UID: \"5d8aa389-0914-474c-8a7a-11a05cb1ee33\") " pod="openshift-dns/dns-default-qbjvx"
Apr 17 17:24:32.678794 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:24:32.678784 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e067e25c-97d9-490a-b6a9-e17550577b5e-cert\") pod \"ingress-canary-szcn9\" (UID: \"e067e25c-97d9-490a-b6a9-e17550577b5e\") " pod="openshift-ingress-canary/ingress-canary-szcn9"
Apr 17 17:24:32.679018 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:24:32.678814 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-b6dm4\" (UniqueName: \"kubernetes.io/projected/e067e25c-97d9-490a-b6a9-e17550577b5e-kube-api-access-b6dm4\") pod \"ingress-canary-szcn9\" (UID: \"e067e25c-97d9-490a-b6a9-e17550577b5e\") " pod="openshift-ingress-canary/ingress-canary-szcn9"
Apr 17 17:24:32.679018 ip-10-0-136-202 kubenswrapper[2570]: E0417 17:24:32.678825 2570 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 17 17:24:32.679018 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:24:32.678843 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/5d8aa389-0914-474c-8a7a-11a05cb1ee33-config-volume\") pod \"dns-default-qbjvx\" (UID: \"5d8aa389-0914-474c-8a7a-11a05cb1ee33\") " pod="openshift-dns/dns-default-qbjvx"
Apr 17 17:24:32.679018 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:24:32.678873 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gcwzz\" (UniqueName: \"kubernetes.io/projected/5d8aa389-0914-474c-8a7a-11a05cb1ee33-kube-api-access-gcwzz\") pod \"dns-default-qbjvx\" (UID: \"5d8aa389-0914-474c-8a7a-11a05cb1ee33\") " pod="openshift-dns/dns-default-qbjvx"
Apr 17 17:24:32.679018 ip-10-0-136-202 kubenswrapper[2570]: E0417 17:24:32.678894 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5d8aa389-0914-474c-8a7a-11a05cb1ee33-metrics-tls podName:5d8aa389-0914-474c-8a7a-11a05cb1ee33 nodeName:}" failed. No retries permitted until 2026-04-17 17:24:33.178873864 +0000 UTC m=+33.558217673 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/5d8aa389-0914-474c-8a7a-11a05cb1ee33-metrics-tls") pod "dns-default-qbjvx" (UID: "5d8aa389-0914-474c-8a7a-11a05cb1ee33") : secret "dns-default-metrics-tls" not found
Apr 17 17:24:32.679018 ip-10-0-136-202 kubenswrapper[2570]: E0417 17:24:32.678922 2570 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 17 17:24:32.679018 ip-10-0-136-202 kubenswrapper[2570]: E0417 17:24:32.678991 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e067e25c-97d9-490a-b6a9-e17550577b5e-cert podName:e067e25c-97d9-490a-b6a9-e17550577b5e nodeName:}" failed. No retries permitted until 2026-04-17 17:24:33.17897306 +0000 UTC m=+33.558316866 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/e067e25c-97d9-490a-b6a9-e17550577b5e-cert") pod "ingress-canary-szcn9" (UID: "e067e25c-97d9-490a-b6a9-e17550577b5e") : secret "canary-serving-cert" not found
Apr 17 17:24:32.679353 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:24:32.679173 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/5d8aa389-0914-474c-8a7a-11a05cb1ee33-tmp-dir\") pod \"dns-default-qbjvx\" (UID: \"5d8aa389-0914-474c-8a7a-11a05cb1ee33\") " pod="openshift-dns/dns-default-qbjvx"
Apr 17 17:24:32.679585 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:24:32.679552 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/5d8aa389-0914-474c-8a7a-11a05cb1ee33-config-volume\") pod \"dns-default-qbjvx\" (UID: \"5d8aa389-0914-474c-8a7a-11a05cb1ee33\") " pod="openshift-dns/dns-default-qbjvx"
Apr 17 17:24:32.690356 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:24:32.690295 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-gcwzz\" (UniqueName: \"kubernetes.io/projected/5d8aa389-0914-474c-8a7a-11a05cb1ee33-kube-api-access-gcwzz\") pod \"dns-default-qbjvx\" (UID: \"5d8aa389-0914-474c-8a7a-11a05cb1ee33\") " pod="openshift-dns/dns-default-qbjvx"
Apr 17 17:24:32.690471 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:24:32.690381 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-b6dm4\" (UniqueName: \"kubernetes.io/projected/e067e25c-97d9-490a-b6a9-e17550577b5e-kube-api-access-b6dm4\") pod \"ingress-canary-szcn9\" (UID: \"e067e25c-97d9-490a-b6a9-e17550577b5e\") " pod="openshift-ingress-canary/ingress-canary-szcn9"
Apr 17 17:24:32.881259 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:24:32.881226 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/82dffe03-2b0c-4ac4-bb02-5a8430704805-metrics-certs\") pod \"network-metrics-daemon-zcn2s\" (UID: \"82dffe03-2b0c-4ac4-bb02-5a8430704805\") " pod="openshift-multus/network-metrics-daemon-zcn2s"
Apr 17 17:24:32.881410 ip-10-0-136-202 kubenswrapper[2570]: E0417 17:24:32.881344 2570 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 17 17:24:32.881476 ip-10-0-136-202 kubenswrapper[2570]: E0417 17:24:32.881417 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/82dffe03-2b0c-4ac4-bb02-5a8430704805-metrics-certs podName:82dffe03-2b0c-4ac4-bb02-5a8430704805 nodeName:}" failed. No retries permitted until 2026-04-17 17:25:04.881397767 +0000 UTC m=+65.260741583 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/82dffe03-2b0c-4ac4-bb02-5a8430704805-metrics-certs") pod "network-metrics-daemon-zcn2s" (UID: "82dffe03-2b0c-4ac4-bb02-5a8430704805") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 17 17:24:32.982498 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:24:32.982420 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zjf97\" (UniqueName: \"kubernetes.io/projected/3b7788b5-7a25-42a0-8536-8a543283bb1e-kube-api-access-zjf97\") pod \"network-check-target-xbzwj\" (UID: \"3b7788b5-7a25-42a0-8536-8a543283bb1e\") " pod="openshift-network-diagnostics/network-check-target-xbzwj"
Apr 17 17:24:32.982631 ip-10-0-136-202 kubenswrapper[2570]: E0417 17:24:32.982579 2570 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 17 17:24:32.982631 ip-10-0-136-202 kubenswrapper[2570]: E0417 17:24:32.982597 2570 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 17 17:24:32.982631 ip-10-0-136-202 kubenswrapper[2570]: E0417 17:24:32.982607 2570 projected.go:194] Error preparing data for projected volume kube-api-access-zjf97 for pod openshift-network-diagnostics/network-check-target-xbzwj: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 17 17:24:32.982741 ip-10-0-136-202 kubenswrapper[2570]: E0417 17:24:32.982666 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b7788b5-7a25-42a0-8536-8a543283bb1e-kube-api-access-zjf97 podName:3b7788b5-7a25-42a0-8536-8a543283bb1e nodeName:}" failed. No retries permitted until 2026-04-17 17:25:04.982651059 +0000 UTC m=+65.361994849 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "kube-api-access-zjf97" (UniqueName: "kubernetes.io/projected/3b7788b5-7a25-42a0-8536-8a543283bb1e-kube-api-access-zjf97") pod "network-check-target-xbzwj" (UID: "3b7788b5-7a25-42a0-8536-8a543283bb1e") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 17 17:24:33.183984 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:24:33.183958 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e067e25c-97d9-490a-b6a9-e17550577b5e-cert\") pod \"ingress-canary-szcn9\" (UID: \"e067e25c-97d9-490a-b6a9-e17550577b5e\") " pod="openshift-ingress-canary/ingress-canary-szcn9"
Apr 17 17:24:33.184132 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:24:33.184018 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/5d8aa389-0914-474c-8a7a-11a05cb1ee33-metrics-tls\") pod \"dns-default-qbjvx\" (UID: \"5d8aa389-0914-474c-8a7a-11a05cb1ee33\") " pod="openshift-dns/dns-default-qbjvx"
Apr 17 17:24:33.184132 ip-10-0-136-202 kubenswrapper[2570]: E0417 17:24:33.184093 2570 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 17 17:24:33.184132 ip-10-0-136-202 kubenswrapper[2570]: E0417 17:24:33.184125 2570 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 17 17:24:33.184313 ip-10-0-136-202 kubenswrapper[2570]: E0417 17:24:33.184164 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e067e25c-97d9-490a-b6a9-e17550577b5e-cert podName:e067e25c-97d9-490a-b6a9-e17550577b5e nodeName:}" failed. No retries permitted until 2026-04-17 17:24:34.184133436 +0000 UTC m=+34.563477228 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/e067e25c-97d9-490a-b6a9-e17550577b5e-cert") pod "ingress-canary-szcn9" (UID: "e067e25c-97d9-490a-b6a9-e17550577b5e") : secret "canary-serving-cert" not found
Apr 17 17:24:33.184313 ip-10-0-136-202 kubenswrapper[2570]: E0417 17:24:33.184185 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5d8aa389-0914-474c-8a7a-11a05cb1ee33-metrics-tls podName:5d8aa389-0914-474c-8a7a-11a05cb1ee33 nodeName:}" failed. No retries permitted until 2026-04-17 17:24:34.184173382 +0000 UTC m=+34.563517173 (durationBeforeRetry 1s).
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/5d8aa389-0914-474c-8a7a-11a05cb1ee33-metrics-tls") pod "dns-default-qbjvx" (UID: "5d8aa389-0914-474c-8a7a-11a05cb1ee33") : secret "dns-default-metrics-tls" not found Apr 17 17:24:34.190102 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:24:34.190071 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/5d8aa389-0914-474c-8a7a-11a05cb1ee33-metrics-tls\") pod \"dns-default-qbjvx\" (UID: \"5d8aa389-0914-474c-8a7a-11a05cb1ee33\") " pod="openshift-dns/dns-default-qbjvx" Apr 17 17:24:34.190470 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:24:34.190118 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e067e25c-97d9-490a-b6a9-e17550577b5e-cert\") pod \"ingress-canary-szcn9\" (UID: \"e067e25c-97d9-490a-b6a9-e17550577b5e\") " pod="openshift-ingress-canary/ingress-canary-szcn9" Apr 17 17:24:34.190470 ip-10-0-136-202 kubenswrapper[2570]: E0417 17:24:34.190252 2570 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 17 17:24:34.190470 ip-10-0-136-202 kubenswrapper[2570]: E0417 17:24:34.190316 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e067e25c-97d9-490a-b6a9-e17550577b5e-cert podName:e067e25c-97d9-490a-b6a9-e17550577b5e nodeName:}" failed. No retries permitted until 2026-04-17 17:24:36.190301354 +0000 UTC m=+36.569645150 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/e067e25c-97d9-490a-b6a9-e17550577b5e-cert") pod "ingress-canary-szcn9" (UID: "e067e25c-97d9-490a-b6a9-e17550577b5e") : secret "canary-serving-cert" not found Apr 17 17:24:34.190470 ip-10-0-136-202 kubenswrapper[2570]: E0417 17:24:34.190252 2570 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 17 17:24:34.190470 ip-10-0-136-202 kubenswrapper[2570]: E0417 17:24:34.190402 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5d8aa389-0914-474c-8a7a-11a05cb1ee33-metrics-tls podName:5d8aa389-0914-474c-8a7a-11a05cb1ee33 nodeName:}" failed. No retries permitted until 2026-04-17 17:24:36.190387937 +0000 UTC m=+36.569731737 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/5d8aa389-0914-474c-8a7a-11a05cb1ee33-metrics-tls") pod "dns-default-qbjvx" (UID: "5d8aa389-0914-474c-8a7a-11a05cb1ee33") : secret "dns-default-metrics-tls" not found Apr 17 17:24:34.249984 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:24:34.249953 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-zcn2s" Apr 17 17:24:34.250090 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:24:34.250008 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xbzwj" Apr 17 17:24:34.255366 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:24:34.255335 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\"" Apr 17 17:24:34.255490 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:24:34.255387 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\"" Apr 17 17:24:34.256441 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:24:34.256420 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-7c8rp\"" Apr 17 17:24:34.256527 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:24:34.256438 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\"" Apr 17 17:24:34.256527 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:24:34.256474 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-zp45b\"" Apr 17 17:24:34.482606 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:24:34.482545 2570 generic.go:358] "Generic (PLEG): container finished" podID="02eb784a-744c-4eac-bec8-46a3df516ca9" containerID="b4e46a4e1a4edf61efb59a9cda5e8743c10f7848544aeadd7d68af628f4b5c0d" exitCode=0 Apr 17 17:24:34.482606 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:24:34.482584 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-j4mx4" event={"ID":"02eb784a-744c-4eac-bec8-46a3df516ca9","Type":"ContainerDied","Data":"b4e46a4e1a4edf61efb59a9cda5e8743c10f7848544aeadd7d68af628f4b5c0d"} Apr 17 17:24:35.486439 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:24:35.486408 2570 generic.go:358] "Generic (PLEG): container finished" podID="02eb784a-744c-4eac-bec8-46a3df516ca9" 
containerID="848ca62267e0ab2642935e80e620046e27b5a682cf45eef7dde576efffb665ad" exitCode=0 Apr 17 17:24:35.486772 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:24:35.486472 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-j4mx4" event={"ID":"02eb784a-744c-4eac-bec8-46a3df516ca9","Type":"ContainerDied","Data":"848ca62267e0ab2642935e80e620046e27b5a682cf45eef7dde576efffb665ad"} Apr 17 17:24:36.202071 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:24:36.201891 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/5d8aa389-0914-474c-8a7a-11a05cb1ee33-metrics-tls\") pod \"dns-default-qbjvx\" (UID: \"5d8aa389-0914-474c-8a7a-11a05cb1ee33\") " pod="openshift-dns/dns-default-qbjvx" Apr 17 17:24:36.202071 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:24:36.202059 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e067e25c-97d9-490a-b6a9-e17550577b5e-cert\") pod \"ingress-canary-szcn9\" (UID: \"e067e25c-97d9-490a-b6a9-e17550577b5e\") " pod="openshift-ingress-canary/ingress-canary-szcn9" Apr 17 17:24:36.202224 ip-10-0-136-202 kubenswrapper[2570]: E0417 17:24:36.202024 2570 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 17 17:24:36.202224 ip-10-0-136-202 kubenswrapper[2570]: E0417 17:24:36.202139 2570 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 17 17:24:36.202224 ip-10-0-136-202 kubenswrapper[2570]: E0417 17:24:36.202180 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5d8aa389-0914-474c-8a7a-11a05cb1ee33-metrics-tls podName:5d8aa389-0914-474c-8a7a-11a05cb1ee33 nodeName:}" failed. No retries permitted until 2026-04-17 17:24:40.20216242 +0000 UTC m=+40.581506211 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/5d8aa389-0914-474c-8a7a-11a05cb1ee33-metrics-tls") pod "dns-default-qbjvx" (UID: "5d8aa389-0914-474c-8a7a-11a05cb1ee33") : secret "dns-default-metrics-tls" not found Apr 17 17:24:36.202224 ip-10-0-136-202 kubenswrapper[2570]: E0417 17:24:36.202194 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e067e25c-97d9-490a-b6a9-e17550577b5e-cert podName:e067e25c-97d9-490a-b6a9-e17550577b5e nodeName:}" failed. No retries permitted until 2026-04-17 17:24:40.202187774 +0000 UTC m=+40.581531564 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/e067e25c-97d9-490a-b6a9-e17550577b5e-cert") pod "ingress-canary-szcn9" (UID: "e067e25c-97d9-490a-b6a9-e17550577b5e") : secret "canary-serving-cert" not found Apr 17 17:24:36.490573 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:24:36.490518 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-j4mx4" event={"ID":"02eb784a-744c-4eac-bec8-46a3df516ca9","Type":"ContainerStarted","Data":"060864f1f643d3278a1be480a09e13e52c4e42fc41619471c5df35f80af7db79"} Apr 17 17:24:36.512966 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:24:36.512920 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-j4mx4" podStartSLOduration=4.617220638 podStartE2EDuration="36.512906513s" podCreationTimestamp="2026-04-17 17:24:00 +0000 UTC" firstStartedPulling="2026-04-17 17:24:01.451470057 +0000 UTC m=+1.830813848" lastFinishedPulling="2026-04-17 17:24:33.347155932 +0000 UTC m=+33.726499723" observedRunningTime="2026-04-17 17:24:36.511851108 +0000 UTC m=+36.891194921" watchObservedRunningTime="2026-04-17 17:24:36.512906513 +0000 UTC m=+36.892250326" Apr 17 17:24:40.226448 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:24:40.226412 2570 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/5d8aa389-0914-474c-8a7a-11a05cb1ee33-metrics-tls\") pod \"dns-default-qbjvx\" (UID: \"5d8aa389-0914-474c-8a7a-11a05cb1ee33\") " pod="openshift-dns/dns-default-qbjvx" Apr 17 17:24:40.226448 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:24:40.226456 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e067e25c-97d9-490a-b6a9-e17550577b5e-cert\") pod \"ingress-canary-szcn9\" (UID: \"e067e25c-97d9-490a-b6a9-e17550577b5e\") " pod="openshift-ingress-canary/ingress-canary-szcn9" Apr 17 17:24:40.226922 ip-10-0-136-202 kubenswrapper[2570]: E0417 17:24:40.226551 2570 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 17 17:24:40.226922 ip-10-0-136-202 kubenswrapper[2570]: E0417 17:24:40.226569 2570 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 17 17:24:40.226922 ip-10-0-136-202 kubenswrapper[2570]: E0417 17:24:40.226598 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e067e25c-97d9-490a-b6a9-e17550577b5e-cert podName:e067e25c-97d9-490a-b6a9-e17550577b5e nodeName:}" failed. No retries permitted until 2026-04-17 17:24:48.226584074 +0000 UTC m=+48.605927865 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/e067e25c-97d9-490a-b6a9-e17550577b5e-cert") pod "ingress-canary-szcn9" (UID: "e067e25c-97d9-490a-b6a9-e17550577b5e") : secret "canary-serving-cert" not found Apr 17 17:24:40.226922 ip-10-0-136-202 kubenswrapper[2570]: E0417 17:24:40.226652 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5d8aa389-0914-474c-8a7a-11a05cb1ee33-metrics-tls podName:5d8aa389-0914-474c-8a7a-11a05cb1ee33 nodeName:}" failed. 
No retries permitted until 2026-04-17 17:24:48.22663123 +0000 UTC m=+48.605975038 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/5d8aa389-0914-474c-8a7a-11a05cb1ee33-metrics-tls") pod "dns-default-qbjvx" (UID: "5d8aa389-0914-474c-8a7a-11a05cb1ee33") : secret "dns-default-metrics-tls" not found Apr 17 17:24:48.275928 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:24:48.275891 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/5d8aa389-0914-474c-8a7a-11a05cb1ee33-metrics-tls\") pod \"dns-default-qbjvx\" (UID: \"5d8aa389-0914-474c-8a7a-11a05cb1ee33\") " pod="openshift-dns/dns-default-qbjvx" Apr 17 17:24:48.276396 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:24:48.275938 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e067e25c-97d9-490a-b6a9-e17550577b5e-cert\") pod \"ingress-canary-szcn9\" (UID: \"e067e25c-97d9-490a-b6a9-e17550577b5e\") " pod="openshift-ingress-canary/ingress-canary-szcn9" Apr 17 17:24:48.276396 ip-10-0-136-202 kubenswrapper[2570]: E0417 17:24:48.276037 2570 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 17 17:24:48.276396 ip-10-0-136-202 kubenswrapper[2570]: E0417 17:24:48.276049 2570 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 17 17:24:48.276396 ip-10-0-136-202 kubenswrapper[2570]: E0417 17:24:48.276115 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e067e25c-97d9-490a-b6a9-e17550577b5e-cert podName:e067e25c-97d9-490a-b6a9-e17550577b5e nodeName:}" failed. No retries permitted until 2026-04-17 17:25:04.276093814 +0000 UTC m=+64.655437634 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/e067e25c-97d9-490a-b6a9-e17550577b5e-cert") pod "ingress-canary-szcn9" (UID: "e067e25c-97d9-490a-b6a9-e17550577b5e") : secret "canary-serving-cert" not found Apr 17 17:24:48.276396 ip-10-0-136-202 kubenswrapper[2570]: E0417 17:24:48.276131 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5d8aa389-0914-474c-8a7a-11a05cb1ee33-metrics-tls podName:5d8aa389-0914-474c-8a7a-11a05cb1ee33 nodeName:}" failed. No retries permitted until 2026-04-17 17:25:04.276124068 +0000 UTC m=+64.655467860 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/5d8aa389-0914-474c-8a7a-11a05cb1ee33-metrics-tls") pod "dns-default-qbjvx" (UID: "5d8aa389-0914-474c-8a7a-11a05cb1ee33") : secret "dns-default-metrics-tls" not found Apr 17 17:24:57.483664 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:24:57.483634 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-42tv5" Apr 17 17:25:04.375620 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:25:04.375581 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/5d8aa389-0914-474c-8a7a-11a05cb1ee33-metrics-tls\") pod \"dns-default-qbjvx\" (UID: \"5d8aa389-0914-474c-8a7a-11a05cb1ee33\") " pod="openshift-dns/dns-default-qbjvx" Apr 17 17:25:04.375620 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:25:04.375626 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e067e25c-97d9-490a-b6a9-e17550577b5e-cert\") pod \"ingress-canary-szcn9\" (UID: \"e067e25c-97d9-490a-b6a9-e17550577b5e\") " pod="openshift-ingress-canary/ingress-canary-szcn9" Apr 17 17:25:04.376060 ip-10-0-136-202 kubenswrapper[2570]: E0417 17:25:04.375716 2570 secret.go:189] Couldn't get secret 
openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 17 17:25:04.376060 ip-10-0-136-202 kubenswrapper[2570]: E0417 17:25:04.375730 2570 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 17 17:25:04.376060 ip-10-0-136-202 kubenswrapper[2570]: E0417 17:25:04.375771 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e067e25c-97d9-490a-b6a9-e17550577b5e-cert podName:e067e25c-97d9-490a-b6a9-e17550577b5e nodeName:}" failed. No retries permitted until 2026-04-17 17:25:36.375754435 +0000 UTC m=+96.755098226 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/e067e25c-97d9-490a-b6a9-e17550577b5e-cert") pod "ingress-canary-szcn9" (UID: "e067e25c-97d9-490a-b6a9-e17550577b5e") : secret "canary-serving-cert" not found Apr 17 17:25:04.376060 ip-10-0-136-202 kubenswrapper[2570]: E0417 17:25:04.375786 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5d8aa389-0914-474c-8a7a-11a05cb1ee33-metrics-tls podName:5d8aa389-0914-474c-8a7a-11a05cb1ee33 nodeName:}" failed. No retries permitted until 2026-04-17 17:25:36.37577859 +0000 UTC m=+96.755122381 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/5d8aa389-0914-474c-8a7a-11a05cb1ee33-metrics-tls") pod "dns-default-qbjvx" (UID: "5d8aa389-0914-474c-8a7a-11a05cb1ee33") : secret "dns-default-metrics-tls" not found Apr 17 17:25:04.980671 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:25:04.980639 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/82dffe03-2b0c-4ac4-bb02-5a8430704805-metrics-certs\") pod \"network-metrics-daemon-zcn2s\" (UID: \"82dffe03-2b0c-4ac4-bb02-5a8430704805\") " pod="openshift-multus/network-metrics-daemon-zcn2s" Apr 17 17:25:04.983188 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:25:04.983168 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\"" Apr 17 17:25:04.991204 ip-10-0-136-202 kubenswrapper[2570]: E0417 17:25:04.991186 2570 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found Apr 17 17:25:04.991270 ip-10-0-136-202 kubenswrapper[2570]: E0417 17:25:04.991242 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/82dffe03-2b0c-4ac4-bb02-5a8430704805-metrics-certs podName:82dffe03-2b0c-4ac4-bb02-5a8430704805 nodeName:}" failed. No retries permitted until 2026-04-17 17:26:08.991227793 +0000 UTC m=+129.370571583 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/82dffe03-2b0c-4ac4-bb02-5a8430704805-metrics-certs") pod "network-metrics-daemon-zcn2s" (UID: "82dffe03-2b0c-4ac4-bb02-5a8430704805") : secret "metrics-daemon-secret" not found Apr 17 17:25:05.081570 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:25:05.081545 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zjf97\" (UniqueName: \"kubernetes.io/projected/3b7788b5-7a25-42a0-8536-8a543283bb1e-kube-api-access-zjf97\") pod \"network-check-target-xbzwj\" (UID: \"3b7788b5-7a25-42a0-8536-8a543283bb1e\") " pod="openshift-network-diagnostics/network-check-target-xbzwj" Apr 17 17:25:05.084172 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:25:05.084136 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\"" Apr 17 17:25:05.094654 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:25:05.094638 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\"" Apr 17 17:25:05.106317 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:25:05.106296 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-zjf97\" (UniqueName: \"kubernetes.io/projected/3b7788b5-7a25-42a0-8536-8a543283bb1e-kube-api-access-zjf97\") pod \"network-check-target-xbzwj\" (UID: \"3b7788b5-7a25-42a0-8536-8a543283bb1e\") " pod="openshift-network-diagnostics/network-check-target-xbzwj" Apr 17 17:25:05.172273 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:25:05.172251 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-zp45b\"" Apr 17 17:25:05.180171 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:25:05.180155 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xbzwj" Apr 17 17:25:05.378030 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:25:05.378000 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-xbzwj"] Apr 17 17:25:05.381002 ip-10-0-136-202 kubenswrapper[2570]: W0417 17:25:05.380968 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3b7788b5_7a25_42a0_8536_8a543283bb1e.slice/crio-6f76ac4ad522fddbb6892425649cfa113fc03db533a7024626335c963e70198d WatchSource:0}: Error finding container 6f76ac4ad522fddbb6892425649cfa113fc03db533a7024626335c963e70198d: Status 404 returned error can't find the container with id 6f76ac4ad522fddbb6892425649cfa113fc03db533a7024626335c963e70198d Apr 17 17:25:05.543221 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:25:05.543162 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xbzwj" event={"ID":"3b7788b5-7a25-42a0-8536-8a543283bb1e","Type":"ContainerStarted","Data":"6f76ac4ad522fddbb6892425649cfa113fc03db533a7024626335c963e70198d"} Apr 17 17:25:06.994583 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:25:06.994251 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["open-cluster-management-agent-addon/klusterlet-addon-workmgr-75658d6877-nwwxq"] Apr 17 17:25:06.997554 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:25:06.997532 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-75658d6877-nwwxq" Apr 17 17:25:07.000097 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:25:07.000064 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"open-cluster-management-agent-addon\"/\"kube-root-ca.crt\"" Apr 17 17:25:07.000097 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:25:07.000062 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"open-cluster-management-image-pull-credentials\"" Apr 17 17:25:07.000903 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:25:07.000882 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"open-cluster-management-agent-addon\"/\"openshift-service-ca.crt\"" Apr 17 17:25:07.001003 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:25:07.000884 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"work-manager-hub-kubeconfig\"" Apr 17 17:25:07.006986 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:25:07.006959 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/klusterlet-addon-workmgr-75658d6877-nwwxq"] Apr 17 17:25:07.035729 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:25:07.035704 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["open-cluster-management-agent-addon/cluster-proxy-proxy-agent-c46845d64-57n44"] Apr 17 17:25:07.038987 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:25:07.038970 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-c46845d64-57n44" Apr 17 17:25:07.041473 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:25:07.041445 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-open-cluster-management.io-proxy-agent-signer-client-cert\"" Apr 17 17:25:07.041572 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:25:07.041533 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-ca\"" Apr 17 17:25:07.041647 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:25:07.041606 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-hub-kubeconfig\"" Apr 17 17:25:07.041802 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:25:07.041785 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-service-proxy-server-certificates\"" Apr 17 17:25:07.051898 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:25:07.051872 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/cluster-proxy-proxy-agent-c46845d64-57n44"] Apr 17 17:25:07.095716 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:25:07.095693 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"klusterlet-config\" (UniqueName: \"kubernetes.io/secret/52cba862-4b49-4900-93a0-4c715e2d4446-klusterlet-config\") pod \"klusterlet-addon-workmgr-75658d6877-nwwxq\" (UID: \"52cba862-4b49-4900-93a0-4c715e2d4446\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-75658d6877-nwwxq" Apr 17 17:25:07.095850 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:25:07.095759 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-xd9m2\" (UniqueName: \"kubernetes.io/projected/52cba862-4b49-4900-93a0-4c715e2d4446-kube-api-access-xd9m2\") pod \"klusterlet-addon-workmgr-75658d6877-nwwxq\" (UID: \"52cba862-4b49-4900-93a0-4c715e2d4446\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-75658d6877-nwwxq"
Apr 17 17:25:07.095850 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:25:07.095816 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/52cba862-4b49-4900-93a0-4c715e2d4446-tmp\") pod \"klusterlet-addon-workmgr-75658d6877-nwwxq\" (UID: \"52cba862-4b49-4900-93a0-4c715e2d4446\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-75658d6877-nwwxq"
Apr 17 17:25:07.196202 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:25:07.196178 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca\" (UniqueName: \"kubernetes.io/secret/4aa0a085-d8e6-4779-b093-a05f445a240b-ca\") pod \"cluster-proxy-proxy-agent-c46845d64-57n44\" (UID: \"4aa0a085-d8e6-4779-b093-a05f445a240b\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-c46845d64-57n44"
Apr 17 17:25:07.196342 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:25:07.196218 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/4aa0a085-d8e6-4779-b093-a05f445a240b-hub-kubeconfig\") pod \"cluster-proxy-proxy-agent-c46845d64-57n44\" (UID: \"4aa0a085-d8e6-4779-b093-a05f445a240b\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-c46845d64-57n44"
Apr 17 17:25:07.196342 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:25:07.196252 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"klusterlet-config\" (UniqueName: \"kubernetes.io/secret/52cba862-4b49-4900-93a0-4c715e2d4446-klusterlet-config\") pod \"klusterlet-addon-workmgr-75658d6877-nwwxq\" (UID: \"52cba862-4b49-4900-93a0-4c715e2d4446\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-75658d6877-nwwxq"
Apr 17 17:25:07.196430 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:25:07.196387 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ocpservice-ca\" (UniqueName: \"kubernetes.io/configmap/4aa0a085-d8e6-4779-b093-a05f445a240b-ocpservice-ca\") pod \"cluster-proxy-proxy-agent-c46845d64-57n44\" (UID: \"4aa0a085-d8e6-4779-b093-a05f445a240b\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-c46845d64-57n44"
Apr 17 17:25:07.196430 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:25:07.196421 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-256zz\" (UniqueName: \"kubernetes.io/projected/4aa0a085-d8e6-4779-b093-a05f445a240b-kube-api-access-256zz\") pod \"cluster-proxy-proxy-agent-c46845d64-57n44\" (UID: \"4aa0a085-d8e6-4779-b093-a05f445a240b\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-c46845d64-57n44"
Apr 17 17:25:07.196527 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:25:07.196464 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-proxy-server-cert\" (UniqueName: \"kubernetes.io/secret/4aa0a085-d8e6-4779-b093-a05f445a240b-service-proxy-server-cert\") pod \"cluster-proxy-proxy-agent-c46845d64-57n44\" (UID: \"4aa0a085-d8e6-4779-b093-a05f445a240b\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-c46845d64-57n44"
Apr 17 17:25:07.196566 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:25:07.196550 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xd9m2\" (UniqueName: \"kubernetes.io/projected/52cba862-4b49-4900-93a0-4c715e2d4446-kube-api-access-xd9m2\") pod \"klusterlet-addon-workmgr-75658d6877-nwwxq\" (UID: \"52cba862-4b49-4900-93a0-4c715e2d4446\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-75658d6877-nwwxq"
Apr 17 17:25:07.196604 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:25:07.196578 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/52cba862-4b49-4900-93a0-4c715e2d4446-tmp\") pod \"klusterlet-addon-workmgr-75658d6877-nwwxq\" (UID: \"52cba862-4b49-4900-93a0-4c715e2d4446\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-75658d6877-nwwxq"
Apr 17 17:25:07.196647 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:25:07.196606 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hub\" (UniqueName: \"kubernetes.io/secret/4aa0a085-d8e6-4779-b093-a05f445a240b-hub\") pod \"cluster-proxy-proxy-agent-c46845d64-57n44\" (UID: \"4aa0a085-d8e6-4779-b093-a05f445a240b\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-c46845d64-57n44"
Apr 17 17:25:07.196994 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:25:07.196975 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/52cba862-4b49-4900-93a0-4c715e2d4446-tmp\") pod \"klusterlet-addon-workmgr-75658d6877-nwwxq\" (UID: \"52cba862-4b49-4900-93a0-4c715e2d4446\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-75658d6877-nwwxq"
Apr 17 17:25:07.199197 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:25:07.199180 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"klusterlet-config\" (UniqueName: \"kubernetes.io/secret/52cba862-4b49-4900-93a0-4c715e2d4446-klusterlet-config\") pod \"klusterlet-addon-workmgr-75658d6877-nwwxq\" (UID: \"52cba862-4b49-4900-93a0-4c715e2d4446\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-75658d6877-nwwxq"
Apr 17 17:25:07.205263 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:25:07.205240 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-xd9m2\" (UniqueName: \"kubernetes.io/projected/52cba862-4b49-4900-93a0-4c715e2d4446-kube-api-access-xd9m2\") pod \"klusterlet-addon-workmgr-75658d6877-nwwxq\" (UID: \"52cba862-4b49-4900-93a0-4c715e2d4446\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-75658d6877-nwwxq"
Apr 17 17:25:07.297564 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:25:07.297538 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca\" (UniqueName: \"kubernetes.io/secret/4aa0a085-d8e6-4779-b093-a05f445a240b-ca\") pod \"cluster-proxy-proxy-agent-c46845d64-57n44\" (UID: \"4aa0a085-d8e6-4779-b093-a05f445a240b\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-c46845d64-57n44"
Apr 17 17:25:07.297683 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:25:07.297571 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/4aa0a085-d8e6-4779-b093-a05f445a240b-hub-kubeconfig\") pod \"cluster-proxy-proxy-agent-c46845d64-57n44\" (UID: \"4aa0a085-d8e6-4779-b093-a05f445a240b\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-c46845d64-57n44"
Apr 17 17:25:07.297683 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:25:07.297614 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ocpservice-ca\" (UniqueName: \"kubernetes.io/configmap/4aa0a085-d8e6-4779-b093-a05f445a240b-ocpservice-ca\") pod \"cluster-proxy-proxy-agent-c46845d64-57n44\" (UID: \"4aa0a085-d8e6-4779-b093-a05f445a240b\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-c46845d64-57n44"
Apr 17 17:25:07.297683 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:25:07.297640 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-256zz\" (UniqueName: \"kubernetes.io/projected/4aa0a085-d8e6-4779-b093-a05f445a240b-kube-api-access-256zz\") pod \"cluster-proxy-proxy-agent-c46845d64-57n44\" (UID: \"4aa0a085-d8e6-4779-b093-a05f445a240b\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-c46845d64-57n44"
Apr 17 17:25:07.297840 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:25:07.297688 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-proxy-server-cert\" (UniqueName: \"kubernetes.io/secret/4aa0a085-d8e6-4779-b093-a05f445a240b-service-proxy-server-cert\") pod \"cluster-proxy-proxy-agent-c46845d64-57n44\" (UID: \"4aa0a085-d8e6-4779-b093-a05f445a240b\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-c46845d64-57n44"
Apr 17 17:25:07.297840 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:25:07.297729 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hub\" (UniqueName: \"kubernetes.io/secret/4aa0a085-d8e6-4779-b093-a05f445a240b-hub\") pod \"cluster-proxy-proxy-agent-c46845d64-57n44\" (UID: \"4aa0a085-d8e6-4779-b093-a05f445a240b\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-c46845d64-57n44"
Apr 17 17:25:07.298409 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:25:07.298385 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ocpservice-ca\" (UniqueName: \"kubernetes.io/configmap/4aa0a085-d8e6-4779-b093-a05f445a240b-ocpservice-ca\") pod \"cluster-proxy-proxy-agent-c46845d64-57n44\" (UID: \"4aa0a085-d8e6-4779-b093-a05f445a240b\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-c46845d64-57n44"
Apr 17 17:25:07.300403 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:25:07.300378 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hub\" (UniqueName: \"kubernetes.io/secret/4aa0a085-d8e6-4779-b093-a05f445a240b-hub\") pod \"cluster-proxy-proxy-agent-c46845d64-57n44\" (UID: \"4aa0a085-d8e6-4779-b093-a05f445a240b\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-c46845d64-57n44"
Apr 17 17:25:07.300487 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:25:07.300406 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca\" (UniqueName: \"kubernetes.io/secret/4aa0a085-d8e6-4779-b093-a05f445a240b-ca\") pod \"cluster-proxy-proxy-agent-c46845d64-57n44\" (UID: \"4aa0a085-d8e6-4779-b093-a05f445a240b\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-c46845d64-57n44"
Apr 17 17:25:07.300595 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:25:07.300576 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/4aa0a085-d8e6-4779-b093-a05f445a240b-hub-kubeconfig\") pod \"cluster-proxy-proxy-agent-c46845d64-57n44\" (UID: \"4aa0a085-d8e6-4779-b093-a05f445a240b\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-c46845d64-57n44"
Apr 17 17:25:07.300656 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:25:07.300607 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-proxy-server-cert\" (UniqueName: \"kubernetes.io/secret/4aa0a085-d8e6-4779-b093-a05f445a240b-service-proxy-server-cert\") pod \"cluster-proxy-proxy-agent-c46845d64-57n44\" (UID: \"4aa0a085-d8e6-4779-b093-a05f445a240b\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-c46845d64-57n44"
Apr 17 17:25:07.306461 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:25:07.306429 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-256zz\" (UniqueName: \"kubernetes.io/projected/4aa0a085-d8e6-4779-b093-a05f445a240b-kube-api-access-256zz\") pod \"cluster-proxy-proxy-agent-c46845d64-57n44\" (UID: \"4aa0a085-d8e6-4779-b093-a05f445a240b\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-c46845d64-57n44"
Apr 17 17:25:07.309195 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:25:07.309157 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-75658d6877-nwwxq"
Apr 17 17:25:07.360897 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:25:07.360876 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-c46845d64-57n44"
Apr 17 17:25:08.155613 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:25:08.155589 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/klusterlet-addon-workmgr-75658d6877-nwwxq"]
Apr 17 17:25:08.158394 ip-10-0-136-202 kubenswrapper[2570]: W0417 17:25:08.158366 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod52cba862_4b49_4900_93a0_4c715e2d4446.slice/crio-cf9d7c9ead0fe3f7608a893cf0c5afd854120b1ab678213c0a7cb61b2b052b49 WatchSource:0}: Error finding container cf9d7c9ead0fe3f7608a893cf0c5afd854120b1ab678213c0a7cb61b2b052b49: Status 404 returned error can't find the container with id cf9d7c9ead0fe3f7608a893cf0c5afd854120b1ab678213c0a7cb61b2b052b49
Apr 17 17:25:08.158394 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:25:08.158382 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/cluster-proxy-proxy-agent-c46845d64-57n44"]
Apr 17 17:25:08.161387 ip-10-0-136-202 kubenswrapper[2570]: W0417 17:25:08.161365 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4aa0a085_d8e6_4779_b093_a05f445a240b.slice/crio-694dcde19849314415d6dcdd31b581cff933ed5aa0ba0ec0e9e9d1cf2a09880d WatchSource:0}: Error finding container 694dcde19849314415d6dcdd31b581cff933ed5aa0ba0ec0e9e9d1cf2a09880d: Status 404 returned error can't find the container with id 694dcde19849314415d6dcdd31b581cff933ed5aa0ba0ec0e9e9d1cf2a09880d
Apr 17 17:25:08.549678 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:25:08.549645 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xbzwj" event={"ID":"3b7788b5-7a25-42a0-8536-8a543283bb1e","Type":"ContainerStarted","Data":"0a2e2d06d4ec5594fd00f1410366d066e280ccb95ef6c03ae000c99990d868bd"}
Apr 17 17:25:08.549807 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:25:08.549750 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-network-diagnostics/network-check-target-xbzwj"
Apr 17 17:25:08.550728 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:25:08.550704 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-c46845d64-57n44" event={"ID":"4aa0a085-d8e6-4779-b093-a05f445a240b","Type":"ContainerStarted","Data":"694dcde19849314415d6dcdd31b581cff933ed5aa0ba0ec0e9e9d1cf2a09880d"}
Apr 17 17:25:08.551649 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:25:08.551626 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-75658d6877-nwwxq" event={"ID":"52cba862-4b49-4900-93a0-4c715e2d4446","Type":"ContainerStarted","Data":"cf9d7c9ead0fe3f7608a893cf0c5afd854120b1ab678213c0a7cb61b2b052b49"}
Apr 17 17:25:08.568272 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:25:08.568233 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-target-xbzwj" podStartSLOduration=65.888147567 podStartE2EDuration="1m8.568222073s" podCreationTimestamp="2026-04-17 17:24:00 +0000 UTC" firstStartedPulling="2026-04-17 17:25:05.38272622 +0000 UTC m=+65.762070010" lastFinishedPulling="2026-04-17 17:25:08.062800712 +0000 UTC m=+68.442144516" observedRunningTime="2026-04-17 17:25:08.566688749 +0000 UTC m=+68.946032559" watchObservedRunningTime="2026-04-17 17:25:08.568222073 +0000 UTC m=+68.947565877"
Apr 17 17:25:12.561362 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:25:12.561322 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-c46845d64-57n44" event={"ID":"4aa0a085-d8e6-4779-b093-a05f445a240b","Type":"ContainerStarted","Data":"eaff37bdaaa3fd642820066fe0b104b16d116496c83b1d3d0d3da9a0a5f9947f"}
Apr 17 17:25:15.569555 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:25:15.569509 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-c46845d64-57n44" event={"ID":"4aa0a085-d8e6-4779-b093-a05f445a240b","Type":"ContainerStarted","Data":"545a2627d1ff9c39ab987bf43cfef4db825e581121feda9b460410b1a596c681"}
Apr 17 17:25:15.569555 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:25:15.569555 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-c46845d64-57n44" event={"ID":"4aa0a085-d8e6-4779-b093-a05f445a240b","Type":"ContainerStarted","Data":"4ce39beeabd9ca985b39c22fdaa7fe2b5ea00828ffab513f58fdbcb1b03760a0"}
Apr 17 17:25:15.571169 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:25:15.571121 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-75658d6877-nwwxq" event={"ID":"52cba862-4b49-4900-93a0-4c715e2d4446","Type":"ContainerStarted","Data":"8eb43b37dbcc1154f414d3d821d787d7e3c87d38ce5e8d0e9b29baac67c11481"}
Apr 17 17:25:15.571335 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:25:15.571322 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-75658d6877-nwwxq"
Apr 17 17:25:15.573208 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:25:15.573185 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-75658d6877-nwwxq"
Apr 17 17:25:15.590432 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:25:15.590386 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-c46845d64-57n44" podStartSLOduration=1.684835349 podStartE2EDuration="8.590369451s" podCreationTimestamp="2026-04-17 17:25:07 +0000 UTC" firstStartedPulling="2026-04-17 17:25:08.162909143 +0000 UTC m=+68.542252934" lastFinishedPulling="2026-04-17 17:25:15.06844323 +0000 UTC m=+75.447787036" observedRunningTime="2026-04-17 17:25:15.588868684 +0000 UTC m=+75.968212515" watchObservedRunningTime="2026-04-17 17:25:15.590369451 +0000 UTC m=+75.969713265"
Apr 17 17:25:15.603302 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:25:15.603260 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-75658d6877-nwwxq" podStartSLOduration=2.682593833 podStartE2EDuration="9.603245419s" podCreationTimestamp="2026-04-17 17:25:06 +0000 UTC" firstStartedPulling="2026-04-17 17:25:08.159991973 +0000 UTC m=+68.539335768" lastFinishedPulling="2026-04-17 17:25:15.080643554 +0000 UTC m=+75.459987354" observedRunningTime="2026-04-17 17:25:15.602710095 +0000 UTC m=+75.982053903" watchObservedRunningTime="2026-04-17 17:25:15.603245419 +0000 UTC m=+75.982589232"
Apr 17 17:25:36.390638 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:25:36.390595 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/5d8aa389-0914-474c-8a7a-11a05cb1ee33-metrics-tls\") pod \"dns-default-qbjvx\" (UID: \"5d8aa389-0914-474c-8a7a-11a05cb1ee33\") " pod="openshift-dns/dns-default-qbjvx"
Apr 17 17:25:36.391031 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:25:36.390654 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e067e25c-97d9-490a-b6a9-e17550577b5e-cert\") pod \"ingress-canary-szcn9\" (UID: \"e067e25c-97d9-490a-b6a9-e17550577b5e\") " pod="openshift-ingress-canary/ingress-canary-szcn9"
Apr 17 17:25:36.391031 ip-10-0-136-202 kubenswrapper[2570]: E0417 17:25:36.390753 2570 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 17 17:25:36.391031 ip-10-0-136-202 kubenswrapper[2570]: E0417 17:25:36.390827 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5d8aa389-0914-474c-8a7a-11a05cb1ee33-metrics-tls podName:5d8aa389-0914-474c-8a7a-11a05cb1ee33 nodeName:}" failed. No retries permitted until 2026-04-17 17:26:40.39080996 +0000 UTC m=+160.770153751 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/5d8aa389-0914-474c-8a7a-11a05cb1ee33-metrics-tls") pod "dns-default-qbjvx" (UID: "5d8aa389-0914-474c-8a7a-11a05cb1ee33") : secret "dns-default-metrics-tls" not found
Apr 17 17:25:36.391031 ip-10-0-136-202 kubenswrapper[2570]: E0417 17:25:36.390753 2570 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 17 17:25:36.391031 ip-10-0-136-202 kubenswrapper[2570]: E0417 17:25:36.390898 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e067e25c-97d9-490a-b6a9-e17550577b5e-cert podName:e067e25c-97d9-490a-b6a9-e17550577b5e nodeName:}" failed. No retries permitted until 2026-04-17 17:26:40.390885819 +0000 UTC m=+160.770229627 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/e067e25c-97d9-490a-b6a9-e17550577b5e-cert") pod "ingress-canary-szcn9" (UID: "e067e25c-97d9-490a-b6a9-e17550577b5e") : secret "canary-serving-cert" not found
Apr 17 17:25:39.556786 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:25:39.556748 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-xbzwj"
Apr 17 17:25:58.594160 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:25:58.594107 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-b6njn_e75ff2ec-f311-467f-a5b6-86322293f3ed/dns-node-resolver/0.log"
Apr 17 17:25:59.996014 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:25:59.995978 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-ks8m2_8947a5af-e839-4e78-8aef-37f0885ea400/node-ca/0.log"
Apr 17 17:26:08.998306 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:26:08.998257 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/82dffe03-2b0c-4ac4-bb02-5a8430704805-metrics-certs\") pod \"network-metrics-daemon-zcn2s\" (UID: \"82dffe03-2b0c-4ac4-bb02-5a8430704805\") " pod="openshift-multus/network-metrics-daemon-zcn2s"
Apr 17 17:26:08.998773 ip-10-0-136-202 kubenswrapper[2570]: E0417 17:26:08.998383 2570 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found
Apr 17 17:26:08.998773 ip-10-0-136-202 kubenswrapper[2570]: E0417 17:26:08.998453 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/82dffe03-2b0c-4ac4-bb02-5a8430704805-metrics-certs podName:82dffe03-2b0c-4ac4-bb02-5a8430704805 nodeName:}" failed. No retries permitted until 2026-04-17 17:28:10.998437795 +0000 UTC m=+251.377781585 (durationBeforeRetry 2m2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/82dffe03-2b0c-4ac4-bb02-5a8430704805-metrics-certs") pod "network-metrics-daemon-zcn2s" (UID: "82dffe03-2b0c-4ac4-bb02-5a8430704805") : secret "metrics-daemon-secret" not found
Apr 17 17:26:20.629703 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:26:20.629667 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-76d6f5448c-dqvzx"]
Apr 17 17:26:20.632599 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:26:20.632583 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-76d6f5448c-dqvzx"
Apr 17 17:26:20.635346 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:26:20.635318 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"installation-pull-secrets\""
Apr 17 17:26:20.635502 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:26:20.635479 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-tls\""
Apr 17 17:26:20.635564 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:26:20.635479 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-jk79w\""
Apr 17 17:26:20.636393 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:26:20.636376 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-private-configuration\""
Apr 17 17:26:20.643809 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:26:20.643785 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"trusted-ca\""
Apr 17 17:26:20.657400 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:26:20.657378 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-76d6f5448c-dqvzx"]
Apr 17 17:26:20.660706 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:26:20.660682 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-runtime-extractor-88mpn"]
Apr 17 17:26:20.663615 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:26:20.663598 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-88mpn"
Apr 17 17:26:20.666405 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:26:20.666386 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-root-ca.crt\""
Apr 17 17:26:20.666488 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:26:20.666431 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-tls\""
Apr 17 17:26:20.666543 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:26:20.666517 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-sa-dockercfg-fcb89\""
Apr 17 17:26:20.666583 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:26:20.666566 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-rbac-proxy\""
Apr 17 17:26:20.667175 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:26:20.667158 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"openshift-service-ca.crt\""
Apr 17 17:26:20.677073 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:26:20.677053 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-88mpn"]
Apr 17 17:26:20.772137 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:26:20.772113 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/6dc52916-462a-4208-9837-dd6fbb842e70-trusted-ca\") pod \"image-registry-76d6f5448c-dqvzx\" (UID: \"6dc52916-462a-4208-9837-dd6fbb842e70\") " pod="openshift-image-registry/image-registry-76d6f5448c-dqvzx"
Apr 17 17:26:20.772252 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:26:20.772166 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/6dc52916-462a-4208-9837-dd6fbb842e70-installation-pull-secrets\") pod \"image-registry-76d6f5448c-dqvzx\" (UID: \"6dc52916-462a-4208-9837-dd6fbb842e70\") " pod="openshift-image-registry/image-registry-76d6f5448c-dqvzx"
Apr 17 17:26:20.772252 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:26:20.772223 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c9cwr\" (UniqueName: \"kubernetes.io/projected/6dc52916-462a-4208-9837-dd6fbb842e70-kube-api-access-c9cwr\") pod \"image-registry-76d6f5448c-dqvzx\" (UID: \"6dc52916-462a-4208-9837-dd6fbb842e70\") " pod="openshift-image-registry/image-registry-76d6f5448c-dqvzx"
Apr 17 17:26:20.772340 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:26:20.772253 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/7f76d408-5470-4147-8801-3af404c3630b-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-88mpn\" (UID: \"7f76d408-5470-4147-8801-3af404c3630b\") " pod="openshift-insights/insights-runtime-extractor-88mpn"
Apr 17 17:26:20.772340 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:26:20.772279 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/7f76d408-5470-4147-8801-3af404c3630b-crio-socket\") pod \"insights-runtime-extractor-88mpn\" (UID: \"7f76d408-5470-4147-8801-3af404c3630b\") " pod="openshift-insights/insights-runtime-extractor-88mpn"
Apr 17 17:26:20.772340 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:26:20.772295 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/6dc52916-462a-4208-9837-dd6fbb842e70-image-registry-private-configuration\") pod \"image-registry-76d6f5448c-dqvzx\" (UID: \"6dc52916-462a-4208-9837-dd6fbb842e70\") " pod="openshift-image-registry/image-registry-76d6f5448c-dqvzx"
Apr 17 17:26:20.772340 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:26:20.772313 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/6dc52916-462a-4208-9837-dd6fbb842e70-registry-certificates\") pod \"image-registry-76d6f5448c-dqvzx\" (UID: \"6dc52916-462a-4208-9837-dd6fbb842e70\") " pod="openshift-image-registry/image-registry-76d6f5448c-dqvzx"
Apr 17 17:26:20.772496 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:26:20.772383 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/7f76d408-5470-4147-8801-3af404c3630b-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-88mpn\" (UID: \"7f76d408-5470-4147-8801-3af404c3630b\") " pod="openshift-insights/insights-runtime-extractor-88mpn"
Apr 17 17:26:20.772496 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:26:20.772406 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/6dc52916-462a-4208-9837-dd6fbb842e70-registry-tls\") pod \"image-registry-76d6f5448c-dqvzx\" (UID: \"6dc52916-462a-4208-9837-dd6fbb842e70\") " pod="openshift-image-registry/image-registry-76d6f5448c-dqvzx"
Apr 17 17:26:20.772496 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:26:20.772440 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/6dc52916-462a-4208-9837-dd6fbb842e70-ca-trust-extracted\") pod \"image-registry-76d6f5448c-dqvzx\" (UID: \"6dc52916-462a-4208-9837-dd6fbb842e70\") " pod="openshift-image-registry/image-registry-76d6f5448c-dqvzx"
Apr 17 17:26:20.772496 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:26:20.772474 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/6dc52916-462a-4208-9837-dd6fbb842e70-bound-sa-token\") pod \"image-registry-76d6f5448c-dqvzx\" (UID: \"6dc52916-462a-4208-9837-dd6fbb842e70\") " pod="openshift-image-registry/image-registry-76d6f5448c-dqvzx"
Apr 17 17:26:20.772671 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:26:20.772521 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cp4z2\" (UniqueName: \"kubernetes.io/projected/7f76d408-5470-4147-8801-3af404c3630b-kube-api-access-cp4z2\") pod \"insights-runtime-extractor-88mpn\" (UID: \"7f76d408-5470-4147-8801-3af404c3630b\") " pod="openshift-insights/insights-runtime-extractor-88mpn"
Apr 17 17:26:20.772671 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:26:20.772568 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/7f76d408-5470-4147-8801-3af404c3630b-data-volume\") pod \"insights-runtime-extractor-88mpn\" (UID: \"7f76d408-5470-4147-8801-3af404c3630b\") " pod="openshift-insights/insights-runtime-extractor-88mpn"
Apr 17 17:26:20.873097 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:26:20.873073 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/6dc52916-462a-4208-9837-dd6fbb842e70-trusted-ca\") pod \"image-registry-76d6f5448c-dqvzx\" (UID: \"6dc52916-462a-4208-9837-dd6fbb842e70\") " pod="openshift-image-registry/image-registry-76d6f5448c-dqvzx"
Apr 17 17:26:20.873216 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:26:20.873105 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/6dc52916-462a-4208-9837-dd6fbb842e70-installation-pull-secrets\") pod \"image-registry-76d6f5448c-dqvzx\" (UID: \"6dc52916-462a-4208-9837-dd6fbb842e70\") " pod="openshift-image-registry/image-registry-76d6f5448c-dqvzx"
Apr 17 17:26:20.873216 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:26:20.873139 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-c9cwr\" (UniqueName: \"kubernetes.io/projected/6dc52916-462a-4208-9837-dd6fbb842e70-kube-api-access-c9cwr\") pod \"image-registry-76d6f5448c-dqvzx\" (UID: \"6dc52916-462a-4208-9837-dd6fbb842e70\") " pod="openshift-image-registry/image-registry-76d6f5448c-dqvzx"
Apr 17 17:26:20.873216 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:26:20.873186 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/7f76d408-5470-4147-8801-3af404c3630b-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-88mpn\" (UID: \"7f76d408-5470-4147-8801-3af404c3630b\") " pod="openshift-insights/insights-runtime-extractor-88mpn"
Apr 17 17:26:20.873389 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:26:20.873219 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/7f76d408-5470-4147-8801-3af404c3630b-crio-socket\") pod \"insights-runtime-extractor-88mpn\" (UID: \"7f76d408-5470-4147-8801-3af404c3630b\") " pod="openshift-insights/insights-runtime-extractor-88mpn"
Apr 17 17:26:20.873389 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:26:20.873246 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/6dc52916-462a-4208-9837-dd6fbb842e70-image-registry-private-configuration\") pod \"image-registry-76d6f5448c-dqvzx\" (UID: \"6dc52916-462a-4208-9837-dd6fbb842e70\") " pod="openshift-image-registry/image-registry-76d6f5448c-dqvzx"
Apr 17 17:26:20.873389 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:26:20.873275 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/6dc52916-462a-4208-9837-dd6fbb842e70-registry-certificates\") pod \"image-registry-76d6f5448c-dqvzx\" (UID: \"6dc52916-462a-4208-9837-dd6fbb842e70\") " pod="openshift-image-registry/image-registry-76d6f5448c-dqvzx"
Apr 17 17:26:20.873389 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:26:20.873308 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/7f76d408-5470-4147-8801-3af404c3630b-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-88mpn\" (UID: \"7f76d408-5470-4147-8801-3af404c3630b\") " pod="openshift-insights/insights-runtime-extractor-88mpn"
Apr 17 17:26:20.873389 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:26:20.873334 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/6dc52916-462a-4208-9837-dd6fbb842e70-registry-tls\") pod \"image-registry-76d6f5448c-dqvzx\" (UID: \"6dc52916-462a-4208-9837-dd6fbb842e70\") " pod="openshift-image-registry/image-registry-76d6f5448c-dqvzx"
Apr 17 17:26:20.873389 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:26:20.873370 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/7f76d408-5470-4147-8801-3af404c3630b-crio-socket\") pod \"insights-runtime-extractor-88mpn\" (UID: \"7f76d408-5470-4147-8801-3af404c3630b\") " pod="openshift-insights/insights-runtime-extractor-88mpn"
Apr 17 17:26:20.873667 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:26:20.873372 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/6dc52916-462a-4208-9837-dd6fbb842e70-ca-trust-extracted\") pod \"image-registry-76d6f5448c-dqvzx\" (UID: \"6dc52916-462a-4208-9837-dd6fbb842e70\") " pod="openshift-image-registry/image-registry-76d6f5448c-dqvzx"
Apr 17 17:26:20.873667 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:26:20.873425 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/6dc52916-462a-4208-9837-dd6fbb842e70-bound-sa-token\") pod \"image-registry-76d6f5448c-dqvzx\" (UID: \"6dc52916-462a-4208-9837-dd6fbb842e70\") " pod="openshift-image-registry/image-registry-76d6f5448c-dqvzx"
Apr 17 17:26:20.873667 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:26:20.873478 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-cp4z2\" (UniqueName: \"kubernetes.io/projected/7f76d408-5470-4147-8801-3af404c3630b-kube-api-access-cp4z2\") pod \"insights-runtime-extractor-88mpn\" (UID: \"7f76d408-5470-4147-8801-3af404c3630b\") " pod="openshift-insights/insights-runtime-extractor-88mpn"
Apr 17 17:26:20.873667 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:26:20.873519 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/7f76d408-5470-4147-8801-3af404c3630b-data-volume\") pod \"insights-runtime-extractor-88mpn\" (UID: \"7f76d408-5470-4147-8801-3af404c3630b\") " pod="openshift-insights/insights-runtime-extractor-88mpn"
Apr 17 17:26:20.873863 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:26:20.873691 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/6dc52916-462a-4208-9837-dd6fbb842e70-ca-trust-extracted\") pod \"image-registry-76d6f5448c-dqvzx\" (UID: \"6dc52916-462a-4208-9837-dd6fbb842e70\") " pod="openshift-image-registry/image-registry-76d6f5448c-dqvzx"
Apr 17 17:26:20.873863 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:26:20.873807 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/7f76d408-5470-4147-8801-3af404c3630b-data-volume\") pod \"insights-runtime-extractor-88mpn\" (UID: \"7f76d408-5470-4147-8801-3af404c3630b\") " pod="openshift-insights/insights-runtime-extractor-88mpn"
Apr 17 17:26:20.873863 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:26:20.873825 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/7f76d408-5470-4147-8801-3af404c3630b-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-88mpn\" (UID: \"7f76d408-5470-4147-8801-3af404c3630b\") " pod="openshift-insights/insights-runtime-extractor-88mpn"
Apr 17 17:26:20.874208 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:26:20.874183 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/6dc52916-462a-4208-9837-dd6fbb842e70-trusted-ca\") pod \"image-registry-76d6f5448c-dqvzx\" (UID: \"6dc52916-462a-4208-9837-dd6fbb842e70\") " pod="openshift-image-registry/image-registry-76d6f5448c-dqvzx"
Apr 17 17:26:20.874701 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:26:20.874677 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/6dc52916-462a-4208-9837-dd6fbb842e70-registry-certificates\") pod \"image-registry-76d6f5448c-dqvzx\" (UID: \"6dc52916-462a-4208-9837-dd6fbb842e70\") " pod="openshift-image-registry/image-registry-76d6f5448c-dqvzx"
Apr 17 17:26:20.875649 ip-10-0-136-202 kubenswrapper[2570]: I0417
17:26:20.875624 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/7f76d408-5470-4147-8801-3af404c3630b-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-88mpn\" (UID: \"7f76d408-5470-4147-8801-3af404c3630b\") " pod="openshift-insights/insights-runtime-extractor-88mpn" Apr 17 17:26:20.875791 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:26:20.875770 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/6dc52916-462a-4208-9837-dd6fbb842e70-image-registry-private-configuration\") pod \"image-registry-76d6f5448c-dqvzx\" (UID: \"6dc52916-462a-4208-9837-dd6fbb842e70\") " pod="openshift-image-registry/image-registry-76d6f5448c-dqvzx" Apr 17 17:26:20.875849 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:26:20.875792 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/6dc52916-462a-4208-9837-dd6fbb842e70-installation-pull-secrets\") pod \"image-registry-76d6f5448c-dqvzx\" (UID: \"6dc52916-462a-4208-9837-dd6fbb842e70\") " pod="openshift-image-registry/image-registry-76d6f5448c-dqvzx" Apr 17 17:26:20.875983 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:26:20.875964 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/6dc52916-462a-4208-9837-dd6fbb842e70-registry-tls\") pod \"image-registry-76d6f5448c-dqvzx\" (UID: \"6dc52916-462a-4208-9837-dd6fbb842e70\") " pod="openshift-image-registry/image-registry-76d6f5448c-dqvzx" Apr 17 17:26:20.885323 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:26:20.885279 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-c9cwr\" (UniqueName: \"kubernetes.io/projected/6dc52916-462a-4208-9837-dd6fbb842e70-kube-api-access-c9cwr\") pod 
\"image-registry-76d6f5448c-dqvzx\" (UID: \"6dc52916-462a-4208-9837-dd6fbb842e70\") " pod="openshift-image-registry/image-registry-76d6f5448c-dqvzx" Apr 17 17:26:20.887936 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:26:20.887914 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-cp4z2\" (UniqueName: \"kubernetes.io/projected/7f76d408-5470-4147-8801-3af404c3630b-kube-api-access-cp4z2\") pod \"insights-runtime-extractor-88mpn\" (UID: \"7f76d408-5470-4147-8801-3af404c3630b\") " pod="openshift-insights/insights-runtime-extractor-88mpn" Apr 17 17:26:20.888474 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:26:20.888446 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/6dc52916-462a-4208-9837-dd6fbb842e70-bound-sa-token\") pod \"image-registry-76d6f5448c-dqvzx\" (UID: \"6dc52916-462a-4208-9837-dd6fbb842e70\") " pod="openshift-image-registry/image-registry-76d6f5448c-dqvzx" Apr 17 17:26:20.941509 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:26:20.941492 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-76d6f5448c-dqvzx" Apr 17 17:26:20.971616 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:26:20.971595 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-insights/insights-runtime-extractor-88mpn" Apr 17 17:26:21.064531 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:26:21.064505 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-76d6f5448c-dqvzx"] Apr 17 17:26:21.068107 ip-10-0-136-202 kubenswrapper[2570]: W0417 17:26:21.068083 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6dc52916_462a_4208_9837_dd6fbb842e70.slice/crio-41c60a60dd3047990e4206777592ee7a087b4ce16d13524f0bbcfa6a6bc04ede WatchSource:0}: Error finding container 41c60a60dd3047990e4206777592ee7a087b4ce16d13524f0bbcfa6a6bc04ede: Status 404 returned error can't find the container with id 41c60a60dd3047990e4206777592ee7a087b4ce16d13524f0bbcfa6a6bc04ede Apr 17 17:26:21.086156 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:26:21.086116 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-88mpn"] Apr 17 17:26:21.105219 ip-10-0-136-202 kubenswrapper[2570]: W0417 17:26:21.105189 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7f76d408_5470_4147_8801_3af404c3630b.slice/crio-b837e32386e56fe560df06bbeebba15cbac28d10e9c78f9b4d55e52621b71575 WatchSource:0}: Error finding container b837e32386e56fe560df06bbeebba15cbac28d10e9c78f9b4d55e52621b71575: Status 404 returned error can't find the container with id b837e32386e56fe560df06bbeebba15cbac28d10e9c78f9b4d55e52621b71575 Apr 17 17:26:21.696808 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:26:21.696782 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-88mpn" event={"ID":"7f76d408-5470-4147-8801-3af404c3630b","Type":"ContainerStarted","Data":"d931385c085a33eeec8d46bdbc09b5e659f313602efc9ba4d6000c4f0f4d9d34"} Apr 17 17:26:21.697105 ip-10-0-136-202 
kubenswrapper[2570]: I0417 17:26:21.696821 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-88mpn" event={"ID":"7f76d408-5470-4147-8801-3af404c3630b","Type":"ContainerStarted","Data":"b837e32386e56fe560df06bbeebba15cbac28d10e9c78f9b4d55e52621b71575"} Apr 17 17:26:21.697907 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:26:21.697887 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-76d6f5448c-dqvzx" event={"ID":"6dc52916-462a-4208-9837-dd6fbb842e70","Type":"ContainerStarted","Data":"d70805c1db580b0a37dd6cf94ee4ce63cb687b2984174bdf5512f1d0e58a7523"} Apr 17 17:26:21.697963 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:26:21.697914 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-76d6f5448c-dqvzx" event={"ID":"6dc52916-462a-4208-9837-dd6fbb842e70","Type":"ContainerStarted","Data":"41c60a60dd3047990e4206777592ee7a087b4ce16d13524f0bbcfa6a6bc04ede"} Apr 17 17:26:21.698010 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:26:21.697999 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-image-registry/image-registry-76d6f5448c-dqvzx" Apr 17 17:26:21.716766 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:26:21.716725 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-76d6f5448c-dqvzx" podStartSLOduration=1.7167100450000001 podStartE2EDuration="1.716710045s" podCreationTimestamp="2026-04-17 17:26:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 17:26:21.715949857 +0000 UTC m=+142.095293671" watchObservedRunningTime="2026-04-17 17:26:21.716710045 +0000 UTC m=+142.096053858" Apr 17 17:26:22.706386 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:26:22.706343 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-insights/insights-runtime-extractor-88mpn" event={"ID":"7f76d408-5470-4147-8801-3af404c3630b","Type":"ContainerStarted","Data":"082cd843d3836c3b78e9297f44b491b969070470404ffc20f5d081b7febd4ead"} Apr 17 17:26:23.710845 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:26:23.710803 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-88mpn" event={"ID":"7f76d408-5470-4147-8801-3af404c3630b","Type":"ContainerStarted","Data":"6848af2de7d68088dee19c20c82ce6a6c705eafaaf19484f8880630d917ada4e"} Apr 17 17:26:23.729893 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:26:23.729844 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-runtime-extractor-88mpn" podStartSLOduration=1.791024528 podStartE2EDuration="3.729830189s" podCreationTimestamp="2026-04-17 17:26:20 +0000 UTC" firstStartedPulling="2026-04-17 17:26:21.158570296 +0000 UTC m=+141.537914086" lastFinishedPulling="2026-04-17 17:26:23.097375943 +0000 UTC m=+143.476719747" observedRunningTime="2026-04-17 17:26:23.729624931 +0000 UTC m=+144.108968745" watchObservedRunningTime="2026-04-17 17:26:23.729830189 +0000 UTC m=+144.109174001" Apr 17 17:26:27.362127 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:26:27.362060 2570 prober.go:120] "Probe failed" probeType="Liveness" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-c46845d64-57n44" podUID="4aa0a085-d8e6-4779-b093-a05f445a240b" containerName="service-proxy" probeResult="failure" output="HTTP probe failed with statuscode: 500" Apr 17 17:26:35.494479 ip-10-0-136-202 kubenswrapper[2570]: E0417 17:26:35.494437 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-tls], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-dns/dns-default-qbjvx" podUID="5d8aa389-0914-474c-8a7a-11a05cb1ee33" Apr 17 17:26:35.509574 ip-10-0-136-202 kubenswrapper[2570]: E0417 
17:26:35.509541 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[cert], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-ingress-canary/ingress-canary-szcn9" podUID="e067e25c-97d9-490a-b6a9-e17550577b5e" Apr 17 17:26:35.743689 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:26:35.743663 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-szcn9" Apr 17 17:26:35.743689 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:26:35.743676 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-qbjvx" Apr 17 17:26:37.264445 ip-10-0-136-202 kubenswrapper[2570]: E0417 17:26:37.264403 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-certs], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-multus/network-metrics-daemon-zcn2s" podUID="82dffe03-2b0c-4ac4-bb02-5a8430704805" Apr 17 17:26:37.362658 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:26:37.362613 2570 prober.go:120] "Probe failed" probeType="Liveness" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-c46845d64-57n44" podUID="4aa0a085-d8e6-4779-b093-a05f445a240b" containerName="service-proxy" probeResult="failure" output="HTTP probe failed with statuscode: 500" Apr 17 17:26:39.987440 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:26:39.987416 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/node-exporter-xtltb"] Apr 17 17:26:39.990295 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:26:39.990280 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/node-exporter-xtltb" Apr 17 17:26:39.993535 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:26:39.993515 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-tls\"" Apr 17 17:26:39.993760 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:26:39.993745 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"node-exporter-accelerators-collector-config\"" Apr 17 17:26:39.993863 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:26:39.993760 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"openshift-service-ca.crt\"" Apr 17 17:26:39.993863 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:26:39.993770 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-kube-rbac-proxy-config\"" Apr 17 17:26:39.993863 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:26:39.993829 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-dockercfg-s79cd\"" Apr 17 17:26:39.993863 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:26:39.993854 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-client-ca\"" Apr 17 17:26:39.994618 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:26:39.994603 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-root-ca.crt\"" Apr 17 17:26:40.112965 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:26:40.112940 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/783c23c8-2363-4b4e-bc25-560321d31f0d-node-exporter-wtmp\") pod \"node-exporter-xtltb\" (UID: \"783c23c8-2363-4b4e-bc25-560321d31f0d\") " 
pod="openshift-monitoring/node-exporter-xtltb" Apr 17 17:26:40.113076 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:26:40.112978 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/783c23c8-2363-4b4e-bc25-560321d31f0d-root\") pod \"node-exporter-xtltb\" (UID: \"783c23c8-2363-4b4e-bc25-560321d31f0d\") " pod="openshift-monitoring/node-exporter-xtltb" Apr 17 17:26:40.113076 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:26:40.113002 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/783c23c8-2363-4b4e-bc25-560321d31f0d-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-xtltb\" (UID: \"783c23c8-2363-4b4e-bc25-560321d31f0d\") " pod="openshift-monitoring/node-exporter-xtltb" Apr 17 17:26:40.113167 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:26:40.113073 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/783c23c8-2363-4b4e-bc25-560321d31f0d-sys\") pod \"node-exporter-xtltb\" (UID: \"783c23c8-2363-4b4e-bc25-560321d31f0d\") " pod="openshift-monitoring/node-exporter-xtltb" Apr 17 17:26:40.113167 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:26:40.113102 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/783c23c8-2363-4b4e-bc25-560321d31f0d-node-exporter-tls\") pod \"node-exporter-xtltb\" (UID: \"783c23c8-2363-4b4e-bc25-560321d31f0d\") " pod="openshift-monitoring/node-exporter-xtltb" Apr 17 17:26:40.113167 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:26:40.113136 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: 
\"kubernetes.io/configmap/783c23c8-2363-4b4e-bc25-560321d31f0d-metrics-client-ca\") pod \"node-exporter-xtltb\" (UID: \"783c23c8-2363-4b4e-bc25-560321d31f0d\") " pod="openshift-monitoring/node-exporter-xtltb" Apr 17 17:26:40.113272 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:26:40.113169 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r779j\" (UniqueName: \"kubernetes.io/projected/783c23c8-2363-4b4e-bc25-560321d31f0d-kube-api-access-r779j\") pod \"node-exporter-xtltb\" (UID: \"783c23c8-2363-4b4e-bc25-560321d31f0d\") " pod="openshift-monitoring/node-exporter-xtltb" Apr 17 17:26:40.113272 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:26:40.113189 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/783c23c8-2363-4b4e-bc25-560321d31f0d-node-exporter-textfile\") pod \"node-exporter-xtltb\" (UID: \"783c23c8-2363-4b4e-bc25-560321d31f0d\") " pod="openshift-monitoring/node-exporter-xtltb" Apr 17 17:26:40.113272 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:26:40.113204 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/783c23c8-2363-4b4e-bc25-560321d31f0d-node-exporter-accelerators-collector-config\") pod \"node-exporter-xtltb\" (UID: \"783c23c8-2363-4b4e-bc25-560321d31f0d\") " pod="openshift-monitoring/node-exporter-xtltb" Apr 17 17:26:40.214196 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:26:40.214172 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/783c23c8-2363-4b4e-bc25-560321d31f0d-sys\") pod \"node-exporter-xtltb\" (UID: \"783c23c8-2363-4b4e-bc25-560321d31f0d\") " pod="openshift-monitoring/node-exporter-xtltb" Apr 17 17:26:40.214307 ip-10-0-136-202 kubenswrapper[2570]: 
I0417 17:26:40.214199 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/783c23c8-2363-4b4e-bc25-560321d31f0d-node-exporter-tls\") pod \"node-exporter-xtltb\" (UID: \"783c23c8-2363-4b4e-bc25-560321d31f0d\") " pod="openshift-monitoring/node-exporter-xtltb" Apr 17 17:26:40.214307 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:26:40.214230 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/783c23c8-2363-4b4e-bc25-560321d31f0d-metrics-client-ca\") pod \"node-exporter-xtltb\" (UID: \"783c23c8-2363-4b4e-bc25-560321d31f0d\") " pod="openshift-monitoring/node-exporter-xtltb" Apr 17 17:26:40.214307 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:26:40.214248 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-r779j\" (UniqueName: \"kubernetes.io/projected/783c23c8-2363-4b4e-bc25-560321d31f0d-kube-api-access-r779j\") pod \"node-exporter-xtltb\" (UID: \"783c23c8-2363-4b4e-bc25-560321d31f0d\") " pod="openshift-monitoring/node-exporter-xtltb" Apr 17 17:26:40.214307 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:26:40.214273 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/783c23c8-2363-4b4e-bc25-560321d31f0d-node-exporter-textfile\") pod \"node-exporter-xtltb\" (UID: \"783c23c8-2363-4b4e-bc25-560321d31f0d\") " pod="openshift-monitoring/node-exporter-xtltb" Apr 17 17:26:40.214307 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:26:40.214283 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/783c23c8-2363-4b4e-bc25-560321d31f0d-sys\") pod \"node-exporter-xtltb\" (UID: \"783c23c8-2363-4b4e-bc25-560321d31f0d\") " pod="openshift-monitoring/node-exporter-xtltb" Apr 17 17:26:40.214493 ip-10-0-136-202 
kubenswrapper[2570]: E0417 17:26:40.214323 2570 secret.go:189] Couldn't get secret openshift-monitoring/node-exporter-tls: secret "node-exporter-tls" not found Apr 17 17:26:40.214493 ip-10-0-136-202 kubenswrapper[2570]: E0417 17:26:40.214379 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/783c23c8-2363-4b4e-bc25-560321d31f0d-node-exporter-tls podName:783c23c8-2363-4b4e-bc25-560321d31f0d nodeName:}" failed. No retries permitted until 2026-04-17 17:26:40.714363302 +0000 UTC m=+161.093707093 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "node-exporter-tls" (UniqueName: "kubernetes.io/secret/783c23c8-2363-4b4e-bc25-560321d31f0d-node-exporter-tls") pod "node-exporter-xtltb" (UID: "783c23c8-2363-4b4e-bc25-560321d31f0d") : secret "node-exporter-tls" not found Apr 17 17:26:40.214493 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:26:40.214398 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/783c23c8-2363-4b4e-bc25-560321d31f0d-node-exporter-accelerators-collector-config\") pod \"node-exporter-xtltb\" (UID: \"783c23c8-2363-4b4e-bc25-560321d31f0d\") " pod="openshift-monitoring/node-exporter-xtltb" Apr 17 17:26:40.214493 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:26:40.214427 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/783c23c8-2363-4b4e-bc25-560321d31f0d-node-exporter-wtmp\") pod \"node-exporter-xtltb\" (UID: \"783c23c8-2363-4b4e-bc25-560321d31f0d\") " pod="openshift-monitoring/node-exporter-xtltb" Apr 17 17:26:40.214493 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:26:40.214451 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/783c23c8-2363-4b4e-bc25-560321d31f0d-root\") pod \"node-exporter-xtltb\" (UID: 
\"783c23c8-2363-4b4e-bc25-560321d31f0d\") " pod="openshift-monitoring/node-exporter-xtltb" Apr 17 17:26:40.214493 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:26:40.214489 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/783c23c8-2363-4b4e-bc25-560321d31f0d-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-xtltb\" (UID: \"783c23c8-2363-4b4e-bc25-560321d31f0d\") " pod="openshift-monitoring/node-exporter-xtltb" Apr 17 17:26:40.214769 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:26:40.214571 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"root\" (UniqueName: \"kubernetes.io/host-path/783c23c8-2363-4b4e-bc25-560321d31f0d-root\") pod \"node-exporter-xtltb\" (UID: \"783c23c8-2363-4b4e-bc25-560321d31f0d\") " pod="openshift-monitoring/node-exporter-xtltb" Apr 17 17:26:40.214769 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:26:40.214608 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/783c23c8-2363-4b4e-bc25-560321d31f0d-node-exporter-wtmp\") pod \"node-exporter-xtltb\" (UID: \"783c23c8-2363-4b4e-bc25-560321d31f0d\") " pod="openshift-monitoring/node-exporter-xtltb" Apr 17 17:26:40.214769 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:26:40.214631 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/783c23c8-2363-4b4e-bc25-560321d31f0d-node-exporter-textfile\") pod \"node-exporter-xtltb\" (UID: \"783c23c8-2363-4b4e-bc25-560321d31f0d\") " pod="openshift-monitoring/node-exporter-xtltb" Apr 17 17:26:40.214911 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:26:40.214797 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/783c23c8-2363-4b4e-bc25-560321d31f0d-metrics-client-ca\") pod 
\"node-exporter-xtltb\" (UID: \"783c23c8-2363-4b4e-bc25-560321d31f0d\") " pod="openshift-monitoring/node-exporter-xtltb" Apr 17 17:26:40.214968 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:26:40.214917 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/783c23c8-2363-4b4e-bc25-560321d31f0d-node-exporter-accelerators-collector-config\") pod \"node-exporter-xtltb\" (UID: \"783c23c8-2363-4b4e-bc25-560321d31f0d\") " pod="openshift-monitoring/node-exporter-xtltb" Apr 17 17:26:40.216766 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:26:40.216748 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/783c23c8-2363-4b4e-bc25-560321d31f0d-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-xtltb\" (UID: \"783c23c8-2363-4b4e-bc25-560321d31f0d\") " pod="openshift-monitoring/node-exporter-xtltb" Apr 17 17:26:40.222686 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:26:40.222661 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-r779j\" (UniqueName: \"kubernetes.io/projected/783c23c8-2363-4b4e-bc25-560321d31f0d-kube-api-access-r779j\") pod \"node-exporter-xtltb\" (UID: \"783c23c8-2363-4b4e-bc25-560321d31f0d\") " pod="openshift-monitoring/node-exporter-xtltb" Apr 17 17:26:40.415104 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:26:40.415073 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/5d8aa389-0914-474c-8a7a-11a05cb1ee33-metrics-tls\") pod \"dns-default-qbjvx\" (UID: \"5d8aa389-0914-474c-8a7a-11a05cb1ee33\") " pod="openshift-dns/dns-default-qbjvx" Apr 17 17:26:40.415263 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:26:40.415124 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: 
\"kubernetes.io/secret/e067e25c-97d9-490a-b6a9-e17550577b5e-cert\") pod \"ingress-canary-szcn9\" (UID: \"e067e25c-97d9-490a-b6a9-e17550577b5e\") " pod="openshift-ingress-canary/ingress-canary-szcn9" Apr 17 17:26:40.417564 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:26:40.417535 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/5d8aa389-0914-474c-8a7a-11a05cb1ee33-metrics-tls\") pod \"dns-default-qbjvx\" (UID: \"5d8aa389-0914-474c-8a7a-11a05cb1ee33\") " pod="openshift-dns/dns-default-qbjvx" Apr 17 17:26:40.417697 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:26:40.417678 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e067e25c-97d9-490a-b6a9-e17550577b5e-cert\") pod \"ingress-canary-szcn9\" (UID: \"e067e25c-97d9-490a-b6a9-e17550577b5e\") " pod="openshift-ingress-canary/ingress-canary-szcn9" Apr 17 17:26:40.547063 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:26:40.547030 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-hd29f\"" Apr 17 17:26:40.547189 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:26:40.547109 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-qbw75\"" Apr 17 17:26:40.554358 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:26:40.554341 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-qbjvx" Apr 17 17:26:40.554446 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:26:40.554408 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-szcn9" Apr 17 17:26:40.678621 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:26:40.678562 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-szcn9"] Apr 17 17:26:40.681390 ip-10-0-136-202 kubenswrapper[2570]: W0417 17:26:40.681349 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode067e25c_97d9_490a_b6a9_e17550577b5e.slice/crio-0ccd65218c9bdfc7a54470c4437e890c3af3ab4d4e1ff9a4e0e391b7a20008ae WatchSource:0}: Error finding container 0ccd65218c9bdfc7a54470c4437e890c3af3ab4d4e1ff9a4e0e391b7a20008ae: Status 404 returned error can't find the container with id 0ccd65218c9bdfc7a54470c4437e890c3af3ab4d4e1ff9a4e0e391b7a20008ae Apr 17 17:26:40.694269 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:26:40.694248 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-qbjvx"] Apr 17 17:26:40.696900 ip-10-0-136-202 kubenswrapper[2570]: W0417 17:26:40.696878 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5d8aa389_0914_474c_8a7a_11a05cb1ee33.slice/crio-727c7f1c80a5beed59568539e649825a2f0f07c6253fc6894c698a8c3367595e WatchSource:0}: Error finding container 727c7f1c80a5beed59568539e649825a2f0f07c6253fc6894c698a8c3367595e: Status 404 returned error can't find the container with id 727c7f1c80a5beed59568539e649825a2f0f07c6253fc6894c698a8c3367595e Apr 17 17:26:40.717454 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:26:40.717428 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/783c23c8-2363-4b4e-bc25-560321d31f0d-node-exporter-tls\") pod \"node-exporter-xtltb\" (UID: \"783c23c8-2363-4b4e-bc25-560321d31f0d\") " pod="openshift-monitoring/node-exporter-xtltb" Apr 17 17:26:40.719431 
ip-10-0-136-202 kubenswrapper[2570]: I0417 17:26:40.719408 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/783c23c8-2363-4b4e-bc25-560321d31f0d-node-exporter-tls\") pod \"node-exporter-xtltb\" (UID: \"783c23c8-2363-4b4e-bc25-560321d31f0d\") " pod="openshift-monitoring/node-exporter-xtltb" Apr 17 17:26:40.756977 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:26:40.756948 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-szcn9" event={"ID":"e067e25c-97d9-490a-b6a9-e17550577b5e","Type":"ContainerStarted","Data":"0ccd65218c9bdfc7a54470c4437e890c3af3ab4d4e1ff9a4e0e391b7a20008ae"} Apr 17 17:26:40.757900 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:26:40.757878 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-qbjvx" event={"ID":"5d8aa389-0914-474c-8a7a-11a05cb1ee33","Type":"ContainerStarted","Data":"727c7f1c80a5beed59568539e649825a2f0f07c6253fc6894c698a8c3367595e"} Apr 17 17:26:40.898581 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:26:40.898555 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/node-exporter-xtltb" Apr 17 17:26:40.907662 ip-10-0-136-202 kubenswrapper[2570]: W0417 17:26:40.907638 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod783c23c8_2363_4b4e_bc25_560321d31f0d.slice/crio-2e7f44115d59233ad2057c67e297db860c11c27a3c916c575be4734b4c5bbad9 WatchSource:0}: Error finding container 2e7f44115d59233ad2057c67e297db860c11c27a3c916c575be4734b4c5bbad9: Status 404 returned error can't find the container with id 2e7f44115d59233ad2057c67e297db860c11c27a3c916c575be4734b4c5bbad9 Apr 17 17:26:40.945832 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:26:40.945782 2570 patch_prober.go:28] interesting pod/image-registry-76d6f5448c-dqvzx container/registry namespace/openshift-image-registry: Liveness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body={"errors":[{"code":"UNAVAILABLE","message":"service unavailable","detail":"health check failed: please see /debug/health"}]} Apr 17 17:26:40.945832 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:26:40.945821 2570 prober.go:120] "Probe failed" probeType="Liveness" pod="openshift-image-registry/image-registry-76d6f5448c-dqvzx" podUID="6dc52916-462a-4208-9837-dd6fbb842e70" containerName="registry" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 17 17:26:41.761570 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:26:41.761530 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-xtltb" event={"ID":"783c23c8-2363-4b4e-bc25-560321d31f0d","Type":"ContainerStarted","Data":"2e7f44115d59233ad2057c67e297db860c11c27a3c916c575be4734b4c5bbad9"} Apr 17 17:26:42.711338 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:26:42.711306 2570 patch_prober.go:28] interesting pod/image-registry-76d6f5448c-dqvzx container/registry namespace/openshift-image-registry: Readiness probe status=failure output="HTTP probe 
failed with statuscode: 503" start-of-body={"errors":[{"code":"UNAVAILABLE","message":"service unavailable","detail":"health check failed: please see /debug/health"}]} Apr 17 17:26:42.711485 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:26:42.711368 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="openshift-image-registry/image-registry-76d6f5448c-dqvzx" podUID="6dc52916-462a-4208-9837-dd6fbb842e70" containerName="registry" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 17 17:26:42.765576 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:26:42.765544 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-qbjvx" event={"ID":"5d8aa389-0914-474c-8a7a-11a05cb1ee33","Type":"ContainerStarted","Data":"8d85f67bdeba2fe22f739fe62814f68bdce58e01ccdb0594984aa89a5f339056"} Apr 17 17:26:42.765576 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:26:42.765577 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-qbjvx" event={"ID":"5d8aa389-0914-474c-8a7a-11a05cb1ee33","Type":"ContainerStarted","Data":"ed953009f1daea3d29b369906558d1c8d36016b93a19f2b1882fcd9ae8cf2cd1"} Apr 17 17:26:42.765955 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:26:42.765690 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-dns/dns-default-qbjvx" Apr 17 17:26:42.785359 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:26:42.785315 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-qbjvx" podStartSLOduration=129.634236662 podStartE2EDuration="2m10.785301526s" podCreationTimestamp="2026-04-17 17:24:32 +0000 UTC" firstStartedPulling="2026-04-17 17:26:40.698461821 +0000 UTC m=+161.077805611" lastFinishedPulling="2026-04-17 17:26:41.849526669 +0000 UTC m=+162.228870475" observedRunningTime="2026-04-17 17:26:42.784474396 +0000 UTC m=+163.163818208" watchObservedRunningTime="2026-04-17 17:26:42.785301526 +0000 UTC 
m=+163.164645339" Apr 17 17:26:44.164791 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:26:44.164757 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/metrics-server-6f6c8f44fb-k2cn9"] Apr 17 17:26:44.167760 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:26:44.167744 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/metrics-server-6f6c8f44fb-k2cn9" Apr 17 17:26:44.171290 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:26:44.171237 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-server-audit-profiles\"" Apr 17 17:26:44.171290 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:26:44.171251 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-client-certs\"" Apr 17 17:26:44.171290 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:26:44.171267 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-dockercfg-mcwrp\"" Apr 17 17:26:44.171290 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:26:44.171281 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kubelet-serving-ca-bundle\"" Apr 17 17:26:44.171534 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:26:44.171261 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-tls\"" Apr 17 17:26:44.171534 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:26:44.171251 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-3b57ju6hc0pp3\"" Apr 17 17:26:44.178748 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:26:44.178725 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/metrics-server-6f6c8f44fb-k2cn9"] Apr 17 17:26:44.347042 ip-10-0-136-202 kubenswrapper[2570]: 
I0417 17:26:44.347003 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7grj8\" (UniqueName: \"kubernetes.io/projected/8ffbee18-703d-4c32-ac71-d0a0076a1b98-kube-api-access-7grj8\") pod \"metrics-server-6f6c8f44fb-k2cn9\" (UID: \"8ffbee18-703d-4c32-ac71-d0a0076a1b98\") " pod="openshift-monitoring/metrics-server-6f6c8f44fb-k2cn9" Apr 17 17:26:44.347212 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:26:44.347059 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-server-client-certs\" (UniqueName: \"kubernetes.io/secret/8ffbee18-703d-4c32-ac71-d0a0076a1b98-secret-metrics-server-client-certs\") pod \"metrics-server-6f6c8f44fb-k2cn9\" (UID: \"8ffbee18-703d-4c32-ac71-d0a0076a1b98\") " pod="openshift-monitoring/metrics-server-6f6c8f44fb-k2cn9" Apr 17 17:26:44.347212 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:26:44.347087 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8ffbee18-703d-4c32-ac71-d0a0076a1b98-client-ca-bundle\") pod \"metrics-server-6f6c8f44fb-k2cn9\" (UID: \"8ffbee18-703d-4c32-ac71-d0a0076a1b98\") " pod="openshift-monitoring/metrics-server-6f6c8f44fb-k2cn9" Apr 17 17:26:44.347212 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:26:44.347127 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/8ffbee18-703d-4c32-ac71-d0a0076a1b98-secret-metrics-server-tls\") pod \"metrics-server-6f6c8f44fb-k2cn9\" (UID: \"8ffbee18-703d-4c32-ac71-d0a0076a1b98\") " pod="openshift-monitoring/metrics-server-6f6c8f44fb-k2cn9" Apr 17 17:26:44.347340 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:26:44.347221 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8ffbee18-703d-4c32-ac71-d0a0076a1b98-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-6f6c8f44fb-k2cn9\" (UID: \"8ffbee18-703d-4c32-ac71-d0a0076a1b98\") " pod="openshift-monitoring/metrics-server-6f6c8f44fb-k2cn9" Apr 17 17:26:44.347340 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:26:44.347291 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/8ffbee18-703d-4c32-ac71-d0a0076a1b98-metrics-server-audit-profiles\") pod \"metrics-server-6f6c8f44fb-k2cn9\" (UID: \"8ffbee18-703d-4c32-ac71-d0a0076a1b98\") " pod="openshift-monitoring/metrics-server-6f6c8f44fb-k2cn9" Apr 17 17:26:44.347340 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:26:44.347332 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/8ffbee18-703d-4c32-ac71-d0a0076a1b98-audit-log\") pod \"metrics-server-6f6c8f44fb-k2cn9\" (UID: \"8ffbee18-703d-4c32-ac71-d0a0076a1b98\") " pod="openshift-monitoring/metrics-server-6f6c8f44fb-k2cn9" Apr 17 17:26:44.448403 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:26:44.448331 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/8ffbee18-703d-4c32-ac71-d0a0076a1b98-audit-log\") pod \"metrics-server-6f6c8f44fb-k2cn9\" (UID: \"8ffbee18-703d-4c32-ac71-d0a0076a1b98\") " pod="openshift-monitoring/metrics-server-6f6c8f44fb-k2cn9" Apr 17 17:26:44.448403 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:26:44.448378 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7grj8\" (UniqueName: \"kubernetes.io/projected/8ffbee18-703d-4c32-ac71-d0a0076a1b98-kube-api-access-7grj8\") pod \"metrics-server-6f6c8f44fb-k2cn9\" (UID: 
\"8ffbee18-703d-4c32-ac71-d0a0076a1b98\") " pod="openshift-monitoring/metrics-server-6f6c8f44fb-k2cn9" Apr 17 17:26:44.448572 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:26:44.448416 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-server-client-certs\" (UniqueName: \"kubernetes.io/secret/8ffbee18-703d-4c32-ac71-d0a0076a1b98-secret-metrics-server-client-certs\") pod \"metrics-server-6f6c8f44fb-k2cn9\" (UID: \"8ffbee18-703d-4c32-ac71-d0a0076a1b98\") " pod="openshift-monitoring/metrics-server-6f6c8f44fb-k2cn9" Apr 17 17:26:44.448572 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:26:44.448456 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8ffbee18-703d-4c32-ac71-d0a0076a1b98-client-ca-bundle\") pod \"metrics-server-6f6c8f44fb-k2cn9\" (UID: \"8ffbee18-703d-4c32-ac71-d0a0076a1b98\") " pod="openshift-monitoring/metrics-server-6f6c8f44fb-k2cn9" Apr 17 17:26:44.448572 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:26:44.448497 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/8ffbee18-703d-4c32-ac71-d0a0076a1b98-secret-metrics-server-tls\") pod \"metrics-server-6f6c8f44fb-k2cn9\" (UID: \"8ffbee18-703d-4c32-ac71-d0a0076a1b98\") " pod="openshift-monitoring/metrics-server-6f6c8f44fb-k2cn9" Apr 17 17:26:44.448572 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:26:44.448527 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8ffbee18-703d-4c32-ac71-d0a0076a1b98-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-6f6c8f44fb-k2cn9\" (UID: \"8ffbee18-703d-4c32-ac71-d0a0076a1b98\") " pod="openshift-monitoring/metrics-server-6f6c8f44fb-k2cn9" Apr 17 17:26:44.448759 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:26:44.448582 2570 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/8ffbee18-703d-4c32-ac71-d0a0076a1b98-metrics-server-audit-profiles\") pod \"metrics-server-6f6c8f44fb-k2cn9\" (UID: \"8ffbee18-703d-4c32-ac71-d0a0076a1b98\") " pod="openshift-monitoring/metrics-server-6f6c8f44fb-k2cn9" Apr 17 17:26:44.448810 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:26:44.448764 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/8ffbee18-703d-4c32-ac71-d0a0076a1b98-audit-log\") pod \"metrics-server-6f6c8f44fb-k2cn9\" (UID: \"8ffbee18-703d-4c32-ac71-d0a0076a1b98\") " pod="openshift-monitoring/metrics-server-6f6c8f44fb-k2cn9" Apr 17 17:26:44.449377 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:26:44.449344 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8ffbee18-703d-4c32-ac71-d0a0076a1b98-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-6f6c8f44fb-k2cn9\" (UID: \"8ffbee18-703d-4c32-ac71-d0a0076a1b98\") " pod="openshift-monitoring/metrics-server-6f6c8f44fb-k2cn9" Apr 17 17:26:44.450049 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:26:44.450015 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/8ffbee18-703d-4c32-ac71-d0a0076a1b98-metrics-server-audit-profiles\") pod \"metrics-server-6f6c8f44fb-k2cn9\" (UID: \"8ffbee18-703d-4c32-ac71-d0a0076a1b98\") " pod="openshift-monitoring/metrics-server-6f6c8f44fb-k2cn9" Apr 17 17:26:44.451154 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:26:44.451123 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8ffbee18-703d-4c32-ac71-d0a0076a1b98-client-ca-bundle\") pod \"metrics-server-6f6c8f44fb-k2cn9\" (UID: 
\"8ffbee18-703d-4c32-ac71-d0a0076a1b98\") " pod="openshift-monitoring/metrics-server-6f6c8f44fb-k2cn9" Apr 17 17:26:44.451809 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:26:44.451783 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/8ffbee18-703d-4c32-ac71-d0a0076a1b98-secret-metrics-server-tls\") pod \"metrics-server-6f6c8f44fb-k2cn9\" (UID: \"8ffbee18-703d-4c32-ac71-d0a0076a1b98\") " pod="openshift-monitoring/metrics-server-6f6c8f44fb-k2cn9" Apr 17 17:26:44.451862 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:26:44.451819 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-server-client-certs\" (UniqueName: \"kubernetes.io/secret/8ffbee18-703d-4c32-ac71-d0a0076a1b98-secret-metrics-server-client-certs\") pod \"metrics-server-6f6c8f44fb-k2cn9\" (UID: \"8ffbee18-703d-4c32-ac71-d0a0076a1b98\") " pod="openshift-monitoring/metrics-server-6f6c8f44fb-k2cn9" Apr 17 17:26:44.466543 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:26:44.466509 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-7grj8\" (UniqueName: \"kubernetes.io/projected/8ffbee18-703d-4c32-ac71-d0a0076a1b98-kube-api-access-7grj8\") pod \"metrics-server-6f6c8f44fb-k2cn9\" (UID: \"8ffbee18-703d-4c32-ac71-d0a0076a1b98\") " pod="openshift-monitoring/metrics-server-6f6c8f44fb-k2cn9" Apr 17 17:26:44.476333 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:26:44.476312 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/metrics-server-6f6c8f44fb-k2cn9" Apr 17 17:26:44.591182 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:26:44.591138 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/metrics-server-6f6c8f44fb-k2cn9"] Apr 17 17:26:44.595013 ip-10-0-136-202 kubenswrapper[2570]: W0417 17:26:44.594984 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8ffbee18_703d_4c32_ac71_d0a0076a1b98.slice/crio-2e117d18bcf01d9a4c72ee32fc901af7f971dce97bbadc980cde7f69e8d6c2a4 WatchSource:0}: Error finding container 2e117d18bcf01d9a4c72ee32fc901af7f971dce97bbadc980cde7f69e8d6c2a4: Status 404 returned error can't find the container with id 2e117d18bcf01d9a4c72ee32fc901af7f971dce97bbadc980cde7f69e8d6c2a4 Apr 17 17:26:44.771826 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:26:44.771799 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/metrics-server-6f6c8f44fb-k2cn9" event={"ID":"8ffbee18-703d-4c32-ac71-d0a0076a1b98","Type":"ContainerStarted","Data":"2e117d18bcf01d9a4c72ee32fc901af7f971dce97bbadc980cde7f69e8d6c2a4"} Apr 17 17:26:46.223436 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:26:46.223371 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 17 17:26:46.226662 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:26:46.226641 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Apr 17 17:26:46.230975 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:26:46.230928 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-prometheus-http-client-file\"" Apr 17 17:26:46.231841 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:26:46.231820 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-rbac-proxy\"" Apr 17 17:26:46.232660 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:26:46.232630 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s\"" Apr 17 17:26:46.232751 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:26:46.232710 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-sidecar-tls\"" Apr 17 17:26:46.233524 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:26:46.232946 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls\"" Apr 17 17:26:46.233524 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:26:46.233057 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-dockercfg-59xwb\"" Apr 17 17:26:46.233524 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:26:46.233173 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-web-config\"" Apr 17 17:26:46.233524 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:26:46.233258 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-kube-rbac-proxy-web\"" Apr 17 17:26:46.233524 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:26:46.233322 2570 reflector.go:430] "Caches populated" type="*v1.Secret" 
reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls-assets-0\"" Apr 17 17:26:46.233524 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:26:46.233390 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"serving-certs-ca-bundle\"" Apr 17 17:26:46.233950 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:26:46.233928 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-client-certs\"" Apr 17 17:26:46.234042 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:26:46.233966 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-rulefiles-0\"" Apr 17 17:26:46.234366 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:26:46.234350 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-grpc-tls-59vkuph86ed6i\"" Apr 17 17:26:46.238456 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:26:46.238436 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"prometheus-trusted-ca-bundle\"" Apr 17 17:26:46.253069 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:26:46.253052 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 17 17:26:46.366121 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:26:46.366095 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/c75de9fb-e653-480b-868f-b20405e94293-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"c75de9fb-e653-480b-868f-b20405e94293\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 17:26:46.366220 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:26:46.366129 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c75de9fb-e653-480b-868f-b20405e94293-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"c75de9fb-e653-480b-868f-b20405e94293\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 17:26:46.366261 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:26:46.366219 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/c75de9fb-e653-480b-868f-b20405e94293-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"c75de9fb-e653-480b-868f-b20405e94293\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 17:26:46.366261 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:26:46.366248 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/c75de9fb-e653-480b-868f-b20405e94293-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"c75de9fb-e653-480b-868f-b20405e94293\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 17:26:46.366321 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:26:46.366271 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/c75de9fb-e653-480b-868f-b20405e94293-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"c75de9fb-e653-480b-868f-b20405e94293\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 17:26:46.366321 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:26:46.366290 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/c75de9fb-e653-480b-868f-b20405e94293-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"c75de9fb-e653-480b-868f-b20405e94293\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 
17:26:46.366321 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:26:46.366313 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/c75de9fb-e653-480b-868f-b20405e94293-config-out\") pod \"prometheus-k8s-0\" (UID: \"c75de9fb-e653-480b-868f-b20405e94293\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 17:26:46.366408 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:26:46.366341 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/c75de9fb-e653-480b-868f-b20405e94293-web-config\") pod \"prometheus-k8s-0\" (UID: \"c75de9fb-e653-480b-868f-b20405e94293\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 17:26:46.366408 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:26:46.366371 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/c75de9fb-e653-480b-868f-b20405e94293-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"c75de9fb-e653-480b-868f-b20405e94293\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 17:26:46.366408 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:26:46.366393 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/c75de9fb-e653-480b-868f-b20405e94293-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"c75de9fb-e653-480b-868f-b20405e94293\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 17:26:46.366498 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:26:46.366426 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/c75de9fb-e653-480b-868f-b20405e94293-tls-assets\") pod \"prometheus-k8s-0\" (UID: 
\"c75de9fb-e653-480b-868f-b20405e94293\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 17:26:46.366498 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:26:46.366453 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c75de9fb-e653-480b-868f-b20405e94293-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"c75de9fb-e653-480b-868f-b20405e94293\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 17:26:46.366498 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:26:46.366478 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/c75de9fb-e653-480b-868f-b20405e94293-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"c75de9fb-e653-480b-868f-b20405e94293\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 17:26:46.366498 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:26:46.366494 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/c75de9fb-e653-480b-868f-b20405e94293-config\") pod \"prometheus-k8s-0\" (UID: \"c75de9fb-e653-480b-868f-b20405e94293\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 17:26:46.366607 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:26:46.366512 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/c75de9fb-e653-480b-868f-b20405e94293-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"c75de9fb-e653-480b-868f-b20405e94293\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 17:26:46.366607 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:26:46.366541 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/c75de9fb-e653-480b-868f-b20405e94293-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"c75de9fb-e653-480b-868f-b20405e94293\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 17:26:46.366607 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:26:46.366560 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8qxfb\" (UniqueName: \"kubernetes.io/projected/c75de9fb-e653-480b-868f-b20405e94293-kube-api-access-8qxfb\") pod \"prometheus-k8s-0\" (UID: \"c75de9fb-e653-480b-868f-b20405e94293\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 17:26:46.366607 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:26:46.366579 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c75de9fb-e653-480b-868f-b20405e94293-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"c75de9fb-e653-480b-868f-b20405e94293\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 17:26:46.467596 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:26:46.467568 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/c75de9fb-e653-480b-868f-b20405e94293-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"c75de9fb-e653-480b-868f-b20405e94293\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 17:26:46.467675 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:26:46.467605 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c75de9fb-e653-480b-868f-b20405e94293-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"c75de9fb-e653-480b-868f-b20405e94293\") " 
pod="openshift-monitoring/prometheus-k8s-0" Apr 17 17:26:46.467675 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:26:46.467643 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/c75de9fb-e653-480b-868f-b20405e94293-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"c75de9fb-e653-480b-868f-b20405e94293\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 17:26:46.467675 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:26:46.467666 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/c75de9fb-e653-480b-868f-b20405e94293-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"c75de9fb-e653-480b-868f-b20405e94293\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 17:26:46.467790 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:26:46.467695 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/c75de9fb-e653-480b-868f-b20405e94293-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"c75de9fb-e653-480b-868f-b20405e94293\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 17:26:46.467790 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:26:46.467723 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/c75de9fb-e653-480b-868f-b20405e94293-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"c75de9fb-e653-480b-868f-b20405e94293\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 17:26:46.467790 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:26:46.467748 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/c75de9fb-e653-480b-868f-b20405e94293-config-out\") pod \"prometheus-k8s-0\" (UID: 
\"c75de9fb-e653-480b-868f-b20405e94293\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 17:26:46.467790 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:26:46.467770 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/c75de9fb-e653-480b-868f-b20405e94293-web-config\") pod \"prometheus-k8s-0\" (UID: \"c75de9fb-e653-480b-868f-b20405e94293\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 17:26:46.467973 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:26:46.467798 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/c75de9fb-e653-480b-868f-b20405e94293-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"c75de9fb-e653-480b-868f-b20405e94293\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 17:26:46.467973 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:26:46.467823 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/c75de9fb-e653-480b-868f-b20405e94293-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"c75de9fb-e653-480b-868f-b20405e94293\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 17:26:46.467973 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:26:46.467848 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/c75de9fb-e653-480b-868f-b20405e94293-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"c75de9fb-e653-480b-868f-b20405e94293\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 17:26:46.467973 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:26:46.467882 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/c75de9fb-e653-480b-868f-b20405e94293-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"c75de9fb-e653-480b-868f-b20405e94293\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 17:26:46.467973 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:26:46.467909 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/c75de9fb-e653-480b-868f-b20405e94293-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"c75de9fb-e653-480b-868f-b20405e94293\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 17:26:46.467973 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:26:46.467937 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/c75de9fb-e653-480b-868f-b20405e94293-config\") pod \"prometheus-k8s-0\" (UID: \"c75de9fb-e653-480b-868f-b20405e94293\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 17:26:46.467973 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:26:46.467961 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/c75de9fb-e653-480b-868f-b20405e94293-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"c75de9fb-e653-480b-868f-b20405e94293\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 17:26:46.468313 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:26:46.467988 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/c75de9fb-e653-480b-868f-b20405e94293-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"c75de9fb-e653-480b-868f-b20405e94293\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 17:26:46.468313 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:26:46.468014 2570 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"kube-api-access-8qxfb\" (UniqueName: \"kubernetes.io/projected/c75de9fb-e653-480b-868f-b20405e94293-kube-api-access-8qxfb\") pod \"prometheus-k8s-0\" (UID: \"c75de9fb-e653-480b-868f-b20405e94293\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 17:26:46.468313 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:26:46.468042 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c75de9fb-e653-480b-868f-b20405e94293-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"c75de9fb-e653-480b-868f-b20405e94293\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 17:26:46.468791 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:26:46.468741 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c75de9fb-e653-480b-868f-b20405e94293-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"c75de9fb-e653-480b-868f-b20405e94293\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 17:26:46.468868 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:26:46.468777 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c75de9fb-e653-480b-868f-b20405e94293-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"c75de9fb-e653-480b-868f-b20405e94293\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 17:26:46.469435 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:26:46.469412 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c75de9fb-e653-480b-868f-b20405e94293-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"c75de9fb-e653-480b-868f-b20405e94293\") " 
pod="openshift-monitoring/prometheus-k8s-0" Apr 17 17:26:46.470267 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:26:46.470204 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/c75de9fb-e653-480b-868f-b20405e94293-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"c75de9fb-e653-480b-868f-b20405e94293\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 17:26:46.472851 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:26:46.470661 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/c75de9fb-e653-480b-868f-b20405e94293-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"c75de9fb-e653-480b-868f-b20405e94293\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 17:26:46.472851 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:26:46.470914 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/c75de9fb-e653-480b-868f-b20405e94293-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"c75de9fb-e653-480b-868f-b20405e94293\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 17:26:46.472851 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:26:46.470970 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/c75de9fb-e653-480b-868f-b20405e94293-config-out\") pod \"prometheus-k8s-0\" (UID: \"c75de9fb-e653-480b-868f-b20405e94293\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 17:26:46.472851 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:26:46.471484 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/c75de9fb-e653-480b-868f-b20405e94293-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: 
\"c75de9fb-e653-480b-868f-b20405e94293\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 17:26:46.472851 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:26:46.471638 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/c75de9fb-e653-480b-868f-b20405e94293-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"c75de9fb-e653-480b-868f-b20405e94293\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 17:26:46.473465 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:26:46.473349 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/c75de9fb-e653-480b-868f-b20405e94293-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"c75de9fb-e653-480b-868f-b20405e94293\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 17:26:46.473776 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:26:46.473752 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/c75de9fb-e653-480b-868f-b20405e94293-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"c75de9fb-e653-480b-868f-b20405e94293\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 17:26:46.473871 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:26:46.473854 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/c75de9fb-e653-480b-868f-b20405e94293-web-config\") pod \"prometheus-k8s-0\" (UID: \"c75de9fb-e653-480b-868f-b20405e94293\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 17:26:46.473936 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:26:46.473879 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/c75de9fb-e653-480b-868f-b20405e94293-config\") pod \"prometheus-k8s-0\" (UID: \"c75de9fb-e653-480b-868f-b20405e94293\") " 
pod="openshift-monitoring/prometheus-k8s-0" Apr 17 17:26:46.473936 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:26:46.473889 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/c75de9fb-e653-480b-868f-b20405e94293-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"c75de9fb-e653-480b-868f-b20405e94293\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 17:26:46.474159 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:26:46.474112 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/c75de9fb-e653-480b-868f-b20405e94293-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"c75de9fb-e653-480b-868f-b20405e94293\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 17:26:46.474268 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:26:46.474251 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/c75de9fb-e653-480b-868f-b20405e94293-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"c75de9fb-e653-480b-868f-b20405e94293\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 17:26:46.475306 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:26:46.475283 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/c75de9fb-e653-480b-868f-b20405e94293-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"c75de9fb-e653-480b-868f-b20405e94293\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 17:26:46.479998 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:26:46.479972 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-8qxfb\" (UniqueName: \"kubernetes.io/projected/c75de9fb-e653-480b-868f-b20405e94293-kube-api-access-8qxfb\") pod 
\"prometheus-k8s-0\" (UID: \"c75de9fb-e653-480b-868f-b20405e94293\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 17:26:46.536933 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:26:46.536908 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Apr 17 17:26:46.680740 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:26:46.680623 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 17 17:26:46.683066 ip-10-0-136-202 kubenswrapper[2570]: W0417 17:26:46.683042 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc75de9fb_e653_480b_868f_b20405e94293.slice/crio-ae8030297004e77494b9a5f21aa39d32d6783ef37cb2c58f3581f7a9468006c8 WatchSource:0}: Error finding container ae8030297004e77494b9a5f21aa39d32d6783ef37cb2c58f3581f7a9468006c8: Status 404 returned error can't find the container with id ae8030297004e77494b9a5f21aa39d32d6783ef37cb2c58f3581f7a9468006c8 Apr 17 17:26:46.783313 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:26:46.783277 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/metrics-server-6f6c8f44fb-k2cn9" event={"ID":"8ffbee18-703d-4c32-ac71-d0a0076a1b98","Type":"ContainerStarted","Data":"ff373d0088f6589850d91d44cb44e5f46f7ed9b9bc4924c0ee37e4c20f3db60b"} Apr 17 17:26:46.784289 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:26:46.784270 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"c75de9fb-e653-480b-868f-b20405e94293","Type":"ContainerStarted","Data":"ae8030297004e77494b9a5f21aa39d32d6783ef37cb2c58f3581f7a9468006c8"} Apr 17 17:26:46.803269 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:26:46.803231 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/metrics-server-6f6c8f44fb-k2cn9" podStartSLOduration=1.445092761 
podStartE2EDuration="2.803218905s" podCreationTimestamp="2026-04-17 17:26:44 +0000 UTC" firstStartedPulling="2026-04-17 17:26:44.596805635 +0000 UTC m=+164.976149425" lastFinishedPulling="2026-04-17 17:26:45.954931775 +0000 UTC m=+166.334275569" observedRunningTime="2026-04-17 17:26:46.801956764 +0000 UTC m=+167.181300576" watchObservedRunningTime="2026-04-17 17:26:46.803218905 +0000 UTC m=+167.182562709" Apr 17 17:26:47.362786 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:26:47.362752 2570 prober.go:120] "Probe failed" probeType="Liveness" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-c46845d64-57n44" podUID="4aa0a085-d8e6-4779-b093-a05f445a240b" containerName="service-proxy" probeResult="failure" output="HTTP probe failed with statuscode: 500" Apr 17 17:26:47.363224 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:26:47.362821 2570 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-c46845d64-57n44" Apr 17 17:26:47.363397 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:26:47.363366 2570 kuberuntime_manager.go:1107] "Message for Container of pod" containerName="service-proxy" containerStatusID={"Type":"cri-o","ID":"545a2627d1ff9c39ab987bf43cfef4db825e581121feda9b460410b1a596c681"} pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-c46845d64-57n44" containerMessage="Container service-proxy failed liveness probe, will be restarted" Apr 17 17:26:47.363453 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:26:47.363437 2570 kuberuntime_container.go:864] "Killing container with a grace period" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-c46845d64-57n44" podUID="4aa0a085-d8e6-4779-b093-a05f445a240b" containerName="service-proxy" containerID="cri-o://545a2627d1ff9c39ab987bf43cfef4db825e581121feda9b460410b1a596c681" gracePeriod=30 Apr 17 17:26:47.790391 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:26:47.790362 2570 generic.go:358] 
"Generic (PLEG): container finished" podID="c75de9fb-e653-480b-868f-b20405e94293" containerID="3e9f75d4a98e2ee125ddc65cd53a0851782d967b9d4b84d389a3d24cbd3a91b9" exitCode=0 Apr 17 17:26:47.790489 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:26:47.790450 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"c75de9fb-e653-480b-868f-b20405e94293","Type":"ContainerDied","Data":"3e9f75d4a98e2ee125ddc65cd53a0851782d967b9d4b84d389a3d24cbd3a91b9"} Apr 17 17:26:47.796747 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:26:47.796724 2570 generic.go:358] "Generic (PLEG): container finished" podID="4aa0a085-d8e6-4779-b093-a05f445a240b" containerID="545a2627d1ff9c39ab987bf43cfef4db825e581121feda9b460410b1a596c681" exitCode=2 Apr 17 17:26:47.796838 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:26:47.796765 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-c46845d64-57n44" event={"ID":"4aa0a085-d8e6-4779-b093-a05f445a240b","Type":"ContainerDied","Data":"545a2627d1ff9c39ab987bf43cfef4db825e581121feda9b460410b1a596c681"} Apr 17 17:26:47.796838 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:26:47.796812 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-c46845d64-57n44" event={"ID":"4aa0a085-d8e6-4779-b093-a05f445a240b","Type":"ContainerStarted","Data":"b96e1b2d0c790b68d3537d43131ee2b3a9f9eff1481d0d680617c2e6f3956c62"} Apr 17 17:26:50.806054 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:26:50.806023 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"c75de9fb-e653-480b-868f-b20405e94293","Type":"ContainerStarted","Data":"b430e72105853ae26e3bcda712021275d851c0e90b8ebffb9278eda0a750dad6"} Apr 17 17:26:50.806054 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:26:50.806056 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"c75de9fb-e653-480b-868f-b20405e94293","Type":"ContainerStarted","Data":"4f36d3ab6dfc66656b98d924445f949141d6a197965d433f249a3aac45c6a189"} Apr 17 17:26:50.931680 ip-10-0-136-202 kubenswrapper[2570]: E0417 17:26:50.931632 2570 log.go:32] "PullImage from image service failed" err="rpc error: code = Unknown desc = unable to pull image or OCI artifact: pull image err: initializing source docker://quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98c903cc8c2ea492f4c9047febaa42525c20c5b414d4de2ce3df5eb65ef899e2: pinging container registry quay.io: received unexpected HTTP status: 504 Gateway Time-out; artifact err: provided artifact is a container image" image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98c903cc8c2ea492f4c9047febaa42525c20c5b414d4de2ce3df5eb65ef899e2" Apr 17 17:26:50.931893 ip-10-0-136-202 kubenswrapper[2570]: E0417 17:26:50.931857 2570 kuberuntime_manager.go:1358] "Unhandled Error" err="container &Container{Name:serve-healthcheck-canary,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98c903cc8c2ea492f4c9047febaa42525c20c5b414d4de2ce3df5eb65ef899e2,Command:[ingress-operator serve-healthcheck],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:,HostPort:0,ContainerPort:8443,Protocol:TCP,HostIP:,},ContainerPort{Name:,HostPort:0,ContainerPort:8888,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:TLS_CERT,Value:/etc/tls-cert/tls.crt,ValueFrom:nil,},EnvVar{Name:TLS_KEY,Value:/etc/tls-cert/tls.key,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{20971520 0} {} 20Mi 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:cert,ReadOnly:false,MountPath:/etc/tls-cert,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-b6dm4,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000320000,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ingress-canary-szcn9_openshift-ingress-canary(e067e25c-97d9-490a-b6a9-e17550577b5e): ErrImagePull: unable to pull image or OCI artifact: pull image err: initializing source docker://quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98c903cc8c2ea492f4c9047febaa42525c20c5b414d4de2ce3df5eb65ef899e2: pinging container registry quay.io: received unexpected HTTP status: 504 Gateway Time-out; artifact err: provided artifact is a container image" logger="UnhandledError" Apr 17 17:26:50.933013 ip-10-0-136-202 kubenswrapper[2570]: E0417 17:26:50.932988 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"serve-healthcheck-canary\" with ErrImagePull: \"unable to pull image or OCI artifact: pull image err: initializing source docker://quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98c903cc8c2ea492f4c9047febaa42525c20c5b414d4de2ce3df5eb65ef899e2: pinging container registry quay.io: received unexpected HTTP 
status: 504 Gateway Time-out; artifact err: provided artifact is a container image\"" pod="openshift-ingress-canary/ingress-canary-szcn9" podUID="e067e25c-97d9-490a-b6a9-e17550577b5e" Apr 17 17:26:50.946122 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:26:50.946054 2570 patch_prober.go:28] interesting pod/image-registry-76d6f5448c-dqvzx container/registry namespace/openshift-image-registry: Liveness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body={"errors":[{"code":"UNAVAILABLE","message":"service unavailable","detail":"health check failed: please see /debug/health"}]} Apr 17 17:26:50.946122 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:26:50.946093 2570 prober.go:120] "Probe failed" probeType="Liveness" pod="openshift-image-registry/image-registry-76d6f5448c-dqvzx" podUID="6dc52916-462a-4208-9837-dd6fbb842e70" containerName="registry" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 17 17:26:51.202340 ip-10-0-136-202 kubenswrapper[2570]: E0417 17:26:51.202251 2570 log.go:32] "PullImage from image service failed" err="rpc error: code = Unknown desc = unable to pull image or OCI artifact: pull image err: copying system image from manifest list: determining manifest MIME type for docker://quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8b4514b20ab8125dc4f2ee9661f8363c837031926b40e7c54a36d1efa08456d1: reading manifest sha256:d87b9fedbc92cc502b5f435d9d5798507256bad49eda2040ac3645623616b5f5 in quay.io/openshift-release-dev/ocp-v4.0-art-dev: received unexpected HTTP status: 504 Gateway Time-out; artifact err: provided artifact is a container image" image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8b4514b20ab8125dc4f2ee9661f8363c837031926b40e7c54a36d1efa08456d1" Apr 17 17:26:51.202608 ip-10-0-136-202 kubenswrapper[2570]: E0417 17:26:51.202556 2570 kuberuntime_manager.go:1358] "Unhandled Error" err="init container 
&Container{Name:init-textfile,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8b4514b20ab8125dc4f2ee9661f8363c837031926b40e7c54a36d1efa08456d1,Command:[/bin/sh -c [[ ! -d /node_exporter/collectors/init ]] || find /node_exporter/collectors/init -perm /111 -type f -exec {} \\;],Args:[],WorkingDir:/var/node_exporter/textfile,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:TMPDIR,Value:/tmp,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{1 -3} {} 1m DecimalSI},memory: {{1048576 0} {} 1Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:node-exporter-textfile,ReadOnly:false,MountPath:/var/node_exporter/textfile,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:node-exporter-wtmp,ReadOnly:true,MountPath:/var/log/wtmp,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-r779j,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:*true,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod node-exporter-xtltb_openshift-monitoring(783c23c8-2363-4b4e-bc25-560321d31f0d): ErrImagePull: unable to pull image or OCI artifact: pull image err: copying system image from manifest list: determining manifest MIME type for 
docker://quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8b4514b20ab8125dc4f2ee9661f8363c837031926b40e7c54a36d1efa08456d1: reading manifest sha256:d87b9fedbc92cc502b5f435d9d5798507256bad49eda2040ac3645623616b5f5 in quay.io/openshift-release-dev/ocp-v4.0-art-dev: received unexpected HTTP status: 504 Gateway Time-out; artifact err: provided artifact is a container image" logger="UnhandledError" Apr 17 17:26:51.204427 ip-10-0-136-202 kubenswrapper[2570]: E0417 17:26:51.204394 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init-textfile\" with ErrImagePull: \"unable to pull image or OCI artifact: pull image err: copying system image from manifest list: determining manifest MIME type for docker://quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8b4514b20ab8125dc4f2ee9661f8363c837031926b40e7c54a36d1efa08456d1: reading manifest sha256:d87b9fedbc92cc502b5f435d9d5798507256bad49eda2040ac3645623616b5f5 in quay.io/openshift-release-dev/ocp-v4.0-art-dev: received unexpected HTTP status: 504 Gateway Time-out; artifact err: provided artifact is a container image\"" pod="openshift-monitoring/node-exporter-xtltb" podUID="783c23c8-2363-4b4e-bc25-560321d31f0d" Apr 17 17:26:51.249396 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:26:51.249364 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-zcn2s" Apr 17 17:26:51.810068 ip-10-0-136-202 kubenswrapper[2570]: E0417 17:26:51.809845 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"serve-healthcheck-canary\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98c903cc8c2ea492f4c9047febaa42525c20c5b414d4de2ce3df5eb65ef899e2\\\": ErrImagePull: unable to pull image or OCI artifact: pull image err: initializing source docker://quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98c903cc8c2ea492f4c9047febaa42525c20c5b414d4de2ce3df5eb65ef899e2: pinging container registry quay.io: received unexpected HTTP status: 504 Gateway Time-out; artifact err: provided artifact is a container image\"" pod="openshift-ingress-canary/ingress-canary-szcn9" podUID="e067e25c-97d9-490a-b6a9-e17550577b5e" Apr 17 17:26:51.810068 ip-10-0-136-202 kubenswrapper[2570]: E0417 17:26:51.810016 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init-textfile\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8b4514b20ab8125dc4f2ee9661f8363c837031926b40e7c54a36d1efa08456d1\\\": ErrImagePull: unable to pull image or OCI artifact: pull image err: copying system image from manifest list: determining manifest MIME type for docker://quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8b4514b20ab8125dc4f2ee9661f8363c837031926b40e7c54a36d1efa08456d1: reading manifest sha256:d87b9fedbc92cc502b5f435d9d5798507256bad49eda2040ac3645623616b5f5 in quay.io/openshift-release-dev/ocp-v4.0-art-dev: received unexpected HTTP status: 504 Gateway Time-out; artifact err: provided artifact is a container image\"" pod="openshift-monitoring/node-exporter-xtltb" podUID="783c23c8-2363-4b4e-bc25-560321d31f0d" Apr 17 17:26:52.711088 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:26:52.711055 2570 
patch_prober.go:28] interesting pod/image-registry-76d6f5448c-dqvzx container/registry namespace/openshift-image-registry: Readiness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body={"errors":[{"code":"UNAVAILABLE","message":"service unavailable","detail":"health check failed: please see /debug/health"}]}
Apr 17 17:26:52.711272 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:26:52.711115 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="openshift-image-registry/image-registry-76d6f5448c-dqvzx" podUID="6dc52916-462a-4208-9837-dd6fbb842e70" containerName="registry" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 17 17:26:52.770269 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:26:52.770243 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-qbjvx"
Apr 17 17:26:52.815210 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:26:52.815184 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"c75de9fb-e653-480b-868f-b20405e94293","Type":"ContainerStarted","Data":"4ad4d0113f02741bb3667128bdea3b1413de9248e0714031769ffb840bb385c0"}
Apr 17 17:26:52.815537 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:26:52.815216 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"c75de9fb-e653-480b-868f-b20405e94293","Type":"ContainerStarted","Data":"bc5186514b53337aa44f1415ed43ee6d671846530dc8f14f00226a973143f775"}
Apr 17 17:26:52.815537 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:26:52.815228 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"c75de9fb-e653-480b-868f-b20405e94293","Type":"ContainerStarted","Data":"a617ba5ffc4026e3e5bdbe786e143f98ee410080729d058e92df66054ef6f5b1"}
Apr 17 17:26:52.815537 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:26:52.815241 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"c75de9fb-e653-480b-868f-b20405e94293","Type":"ContainerStarted","Data":"553731f26c8f98734ad741018513eea41509129aa2a05d82c8a42407f34740a8"}
Apr 17 17:26:52.841519 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:26:52.841272 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-k8s-0" podStartSLOduration=1.288888637 podStartE2EDuration="6.841254653s" podCreationTimestamp="2026-04-17 17:26:46 +0000 UTC" firstStartedPulling="2026-04-17 17:26:46.684825692 +0000 UTC m=+167.064169484" lastFinishedPulling="2026-04-17 17:26:52.237191691 +0000 UTC m=+172.616535500" observedRunningTime="2026-04-17 17:26:52.840216484 +0000 UTC m=+173.219560297" watchObservedRunningTime="2026-04-17 17:26:52.841254653 +0000 UTC m=+173.220598466"
Apr 17 17:26:56.537848 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:26:56.537807 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 17:27:00.945521 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:27:00.945482 2570 patch_prober.go:28] interesting pod/image-registry-76d6f5448c-dqvzx container/registry namespace/openshift-image-registry: Liveness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body={"errors":[{"code":"UNAVAILABLE","message":"service unavailable","detail":"health check failed: please see /debug/health"}]}
Apr 17 17:27:00.945925 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:27:00.945547 2570 prober.go:120] "Probe failed" probeType="Liveness" pod="openshift-image-registry/image-registry-76d6f5448c-dqvzx" podUID="6dc52916-462a-4208-9837-dd6fbb842e70" containerName="registry" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 17 17:27:00.945925 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:27:00.945587 2570 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-image-registry/image-registry-76d6f5448c-dqvzx"
Apr 17 17:27:00.946050 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:27:00.945987 2570 kuberuntime_manager.go:1107] "Message for Container of pod" containerName="registry" containerStatusID={"Type":"cri-o","ID":"d70805c1db580b0a37dd6cf94ee4ce63cb687b2984174bdf5512f1d0e58a7523"} pod="openshift-image-registry/image-registry-76d6f5448c-dqvzx" containerMessage="Container registry failed liveness probe, will be restarted"
Apr 17 17:27:00.949259 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:27:00.949234 2570 patch_prober.go:28] interesting pod/image-registry-76d6f5448c-dqvzx container/registry namespace/openshift-image-registry: Readiness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body={"errors":[{"code":"UNAVAILABLE","message":"service unavailable","detail":"health check failed: please see /debug/health"}]}
Apr 17 17:27:00.949390 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:27:00.949281 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="openshift-image-registry/image-registry-76d6f5448c-dqvzx" podUID="6dc52916-462a-4208-9837-dd6fbb842e70" containerName="registry" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 17 17:27:02.029801 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:27:02.029765 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-qbjvx_5d8aa389-0914-474c-8a7a-11a05cb1ee33/dns/0.log"
Apr 17 17:27:02.229131 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:27:02.229100 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-qbjvx_5d8aa389-0914-474c-8a7a-11a05cb1ee33/kube-rbac-proxy/0.log"
Apr 17 17:27:02.829602 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:27:02.829580 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-b6njn_e75ff2ec-f311-467f-a5b6-86322293f3ed/dns-node-resolver/0.log"
Apr 17 17:27:04.477378 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:27:04.477341 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/metrics-server-6f6c8f44fb-k2cn9"
Apr 17 17:27:04.477378 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:27:04.477388 2570 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-monitoring/metrics-server-6f6c8f44fb-k2cn9"
Apr 17 17:27:05.852128 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:27:05.852088 2570 generic.go:358] "Generic (PLEG): container finished" podID="783c23c8-2363-4b4e-bc25-560321d31f0d" containerID="f5c78a95b285839a4c3156ddf95a9ff6369e328ccf85a83cffa3aff46d381ed6" exitCode=0
Apr 17 17:27:05.852642 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:27:05.852189 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-xtltb" event={"ID":"783c23c8-2363-4b4e-bc25-560321d31f0d","Type":"ContainerDied","Data":"f5c78a95b285839a4c3156ddf95a9ff6369e328ccf85a83cffa3aff46d381ed6"}
Apr 17 17:27:06.856514 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:27:06.856480 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-xtltb" event={"ID":"783c23c8-2363-4b4e-bc25-560321d31f0d","Type":"ContainerStarted","Data":"859599444e377adc284472d21f7cad93215a147f117de8a43dc5549d292a88f8"}
Apr 17 17:27:06.856514 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:27:06.856520 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-xtltb" event={"ID":"783c23c8-2363-4b4e-bc25-560321d31f0d","Type":"ContainerStarted","Data":"dc387725ca9996fc525c0a746be40a18ab62862c024a80ea43d3a0d09b742c0e"}
Apr 17 17:27:06.876524 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:27:06.876471 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/node-exporter-xtltb" podStartSLOduration=3.87832914 podStartE2EDuration="27.876456349s" podCreationTimestamp="2026-04-17 17:26:39 +0000 UTC" firstStartedPulling="2026-04-17 17:26:40.909133959 +0000 UTC m=+161.288477750" lastFinishedPulling="2026-04-17 17:27:04.907261152 +0000 UTC m=+185.286604959" observedRunningTime="2026-04-17 17:27:06.875209286 +0000 UTC m=+187.254553099" watchObservedRunningTime="2026-04-17 17:27:06.876456349 +0000 UTC m=+187.255800161"
Apr 17 17:27:09.865211 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:27:09.865173 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-szcn9" event={"ID":"e067e25c-97d9-490a-b6a9-e17550577b5e","Type":"ContainerStarted","Data":"453b3ef9b844a20bbab2804ffb47bed3b119a7b0d48033b216f6c44b31c66703"}
Apr 17 17:27:09.882974 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:27:09.882913 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-szcn9" podStartSLOduration=129.751326259 podStartE2EDuration="2m37.882897627s" podCreationTimestamp="2026-04-17 17:24:32 +0000 UTC" firstStartedPulling="2026-04-17 17:26:40.685415591 +0000 UTC m=+161.064759382" lastFinishedPulling="2026-04-17 17:27:08.816986946 +0000 UTC m=+189.196330750" observedRunningTime="2026-04-17 17:27:09.882346018 +0000 UTC m=+190.261689834" watchObservedRunningTime="2026-04-17 17:27:09.882897627 +0000 UTC m=+190.262241438"
Apr 17 17:27:10.949746 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:27:10.949716 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-76d6f5448c-dqvzx"
Apr 17 17:27:24.482209 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:27:24.482171 2570 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-monitoring/metrics-server-6f6c8f44fb-k2cn9"
Apr 17 17:27:24.486271 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:27:24.486245 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/metrics-server-6f6c8f44fb-k2cn9"
Apr 17 17:27:25.963899 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:27:25.963860 2570 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-image-registry/image-registry-76d6f5448c-dqvzx" podUID="6dc52916-462a-4208-9837-dd6fbb842e70" containerName="registry" containerID="cri-o://d70805c1db580b0a37dd6cf94ee4ce63cb687b2984174bdf5512f1d0e58a7523" gracePeriod=30
Apr 17 17:27:27.915218 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:27:27.915186 2570 generic.go:358] "Generic (PLEG): container finished" podID="6dc52916-462a-4208-9837-dd6fbb842e70" containerID="d70805c1db580b0a37dd6cf94ee4ce63cb687b2984174bdf5512f1d0e58a7523" exitCode=0
Apr 17 17:27:27.915621 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:27:27.915277 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-76d6f5448c-dqvzx" event={"ID":"6dc52916-462a-4208-9837-dd6fbb842e70","Type":"ContainerDied","Data":"d70805c1db580b0a37dd6cf94ee4ce63cb687b2984174bdf5512f1d0e58a7523"}
Apr 17 17:27:27.915621 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:27:27.915316 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-76d6f5448c-dqvzx" event={"ID":"6dc52916-462a-4208-9837-dd6fbb842e70","Type":"ContainerStarted","Data":"3b2972ae720c4ba13440fea7d3ca9f1a384aebcb09a3672f5a8c32f00c444b1a"}
Apr 17 17:27:27.915621 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:27:27.915352 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-image-registry/image-registry-76d6f5448c-dqvzx"
Apr 17 17:27:46.537214 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:27:46.537169 2570 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 17:27:46.557961 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:27:46.557934 2570 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 17:27:46.983574 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:27:46.983502 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 17:27:48.922042 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:27:48.922012 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-76d6f5448c-dqvzx"
Apr 17 17:28:04.578677 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:28:04.578638 2570 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"]
Apr 17 17:28:04.579190 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:28:04.579086 2570 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="c75de9fb-e653-480b-868f-b20405e94293" containerName="prometheus" containerID="cri-o://4f36d3ab6dfc66656b98d924445f949141d6a197965d433f249a3aac45c6a189" gracePeriod=600
Apr 17 17:28:04.579190 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:28:04.579115 2570 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="c75de9fb-e653-480b-868f-b20405e94293" containerName="kube-rbac-proxy" containerID="cri-o://bc5186514b53337aa44f1415ed43ee6d671846530dc8f14f00226a973143f775" gracePeriod=600
Apr 17 17:28:04.579339 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:28:04.579173 2570 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="c75de9fb-e653-480b-868f-b20405e94293" containerName="kube-rbac-proxy-thanos" containerID="cri-o://4ad4d0113f02741bb3667128bdea3b1413de9248e0714031769ffb840bb385c0" gracePeriod=600
Apr 17 17:28:04.579339 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:28:04.579164 2570 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="c75de9fb-e653-480b-868f-b20405e94293" containerName="thanos-sidecar" containerID="cri-o://553731f26c8f98734ad741018513eea41509129aa2a05d82c8a42407f34740a8" gracePeriod=600
Apr 17 17:28:04.579339 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:28:04.579228 2570 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="c75de9fb-e653-480b-868f-b20405e94293" containerName="kube-rbac-proxy-web" containerID="cri-o://a617ba5ffc4026e3e5bdbe786e143f98ee410080729d058e92df66054ef6f5b1" gracePeriod=600
Apr 17 17:28:04.579339 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:28:04.579235 2570 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="c75de9fb-e653-480b-868f-b20405e94293" containerName="config-reloader" containerID="cri-o://b430e72105853ae26e3bcda712021275d851c0e90b8ebffb9278eda0a750dad6" gracePeriod=600
Apr 17 17:28:04.808758 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:28:04.808737 2570 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 17:28:04.936325 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:28:04.936253 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/c75de9fb-e653-480b-868f-b20405e94293-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"c75de9fb-e653-480b-868f-b20405e94293\" (UID: \"c75de9fb-e653-480b-868f-b20405e94293\") "
Apr 17 17:28:04.936325 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:28:04.936303 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/c75de9fb-e653-480b-868f-b20405e94293-config-out\") pod \"c75de9fb-e653-480b-868f-b20405e94293\" (UID: \"c75de9fb-e653-480b-868f-b20405e94293\") "
Apr 17 17:28:04.936524 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:28:04.936336 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/c75de9fb-e653-480b-868f-b20405e94293-thanos-prometheus-http-client-file\") pod \"c75de9fb-e653-480b-868f-b20405e94293\" (UID: \"c75de9fb-e653-480b-868f-b20405e94293\") "
Apr 17 17:28:04.936524 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:28:04.936370 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/c75de9fb-e653-480b-868f-b20405e94293-web-config\") pod \"c75de9fb-e653-480b-868f-b20405e94293\" (UID: \"c75de9fb-e653-480b-868f-b20405e94293\") "
Apr 17 17:28:04.936524 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:28:04.936394 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/c75de9fb-e653-480b-868f-b20405e94293-config\") pod \"c75de9fb-e653-480b-868f-b20405e94293\" (UID: \"c75de9fb-e653-480b-868f-b20405e94293\") "
Apr 17 17:28:04.936524 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:28:04.936422 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/c75de9fb-e653-480b-868f-b20405e94293-configmap-metrics-client-ca\") pod \"c75de9fb-e653-480b-868f-b20405e94293\" (UID: \"c75de9fb-e653-480b-868f-b20405e94293\") "
Apr 17 17:28:04.936524 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:28:04.936451 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c75de9fb-e653-480b-868f-b20405e94293-prometheus-trusted-ca-bundle\") pod \"c75de9fb-e653-480b-868f-b20405e94293\" (UID: \"c75de9fb-e653-480b-868f-b20405e94293\") "
Apr 17 17:28:04.936524 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:28:04.936482 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/c75de9fb-e653-480b-868f-b20405e94293-secret-grpc-tls\") pod \"c75de9fb-e653-480b-868f-b20405e94293\" (UID: \"c75de9fb-e653-480b-868f-b20405e94293\") "
Apr 17 17:28:04.936524 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:28:04.936513 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/c75de9fb-e653-480b-868f-b20405e94293-prometheus-k8s-rulefiles-0\") pod \"c75de9fb-e653-480b-868f-b20405e94293\" (UID: \"c75de9fb-e653-480b-868f-b20405e94293\") "
Apr 17 17:28:04.936880 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:28:04.936544 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/c75de9fb-e653-480b-868f-b20405e94293-secret-metrics-client-certs\") pod \"c75de9fb-e653-480b-868f-b20405e94293\" (UID: \"c75de9fb-e653-480b-868f-b20405e94293\") "
Apr 17 17:28:04.936880 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:28:04.936573 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/c75de9fb-e653-480b-868f-b20405e94293-prometheus-k8s-db\") pod \"c75de9fb-e653-480b-868f-b20405e94293\" (UID: \"c75de9fb-e653-480b-868f-b20405e94293\") "
Apr 17 17:28:04.936880 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:28:04.936598 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c75de9fb-e653-480b-868f-b20405e94293-configmap-serving-certs-ca-bundle\") pod \"c75de9fb-e653-480b-868f-b20405e94293\" (UID: \"c75de9fb-e653-480b-868f-b20405e94293\") "
Apr 17 17:28:04.936880 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:28:04.936648 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/c75de9fb-e653-480b-868f-b20405e94293-tls-assets\") pod \"c75de9fb-e653-480b-868f-b20405e94293\" (UID: \"c75de9fb-e653-480b-868f-b20405e94293\") "
Apr 17 17:28:04.936880 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:28:04.936683 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/c75de9fb-e653-480b-868f-b20405e94293-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"c75de9fb-e653-480b-868f-b20405e94293\" (UID: \"c75de9fb-e653-480b-868f-b20405e94293\") "
Apr 17 17:28:04.936880 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:28:04.936725 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8qxfb\" (UniqueName: \"kubernetes.io/projected/c75de9fb-e653-480b-868f-b20405e94293-kube-api-access-8qxfb\") pod \"c75de9fb-e653-480b-868f-b20405e94293\" (UID: \"c75de9fb-e653-480b-868f-b20405e94293\") "
Apr 17 17:28:04.936880 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:28:04.936760 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/c75de9fb-e653-480b-868f-b20405e94293-secret-kube-rbac-proxy\") pod \"c75de9fb-e653-480b-868f-b20405e94293\" (UID: \"c75de9fb-e653-480b-868f-b20405e94293\") "
Apr 17 17:28:04.936880 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:28:04.936786 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/c75de9fb-e653-480b-868f-b20405e94293-secret-prometheus-k8s-tls\") pod \"c75de9fb-e653-480b-868f-b20405e94293\" (UID: \"c75de9fb-e653-480b-868f-b20405e94293\") "
Apr 17 17:28:04.936880 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:28:04.936813 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c75de9fb-e653-480b-868f-b20405e94293-configmap-kubelet-serving-ca-bundle\") pod \"c75de9fb-e653-480b-868f-b20405e94293\" (UID: \"c75de9fb-e653-480b-868f-b20405e94293\") "
Apr 17 17:28:04.937334 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:28:04.937238 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c75de9fb-e653-480b-868f-b20405e94293-configmap-metrics-client-ca" (OuterVolumeSpecName: "configmap-metrics-client-ca") pod "c75de9fb-e653-480b-868f-b20405e94293" (UID: "c75de9fb-e653-480b-868f-b20405e94293"). InnerVolumeSpecName "configmap-metrics-client-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 17 17:28:04.937389 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:28:04.937339 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c75de9fb-e653-480b-868f-b20405e94293-prometheus-trusted-ca-bundle" (OuterVolumeSpecName: "prometheus-trusted-ca-bundle") pod "c75de9fb-e653-480b-868f-b20405e94293" (UID: "c75de9fb-e653-480b-868f-b20405e94293"). InnerVolumeSpecName "prometheus-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 17 17:28:04.937459 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:28:04.937397 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c75de9fb-e653-480b-868f-b20405e94293-configmap-kubelet-serving-ca-bundle" (OuterVolumeSpecName: "configmap-kubelet-serving-ca-bundle") pod "c75de9fb-e653-480b-868f-b20405e94293" (UID: "c75de9fb-e653-480b-868f-b20405e94293"). InnerVolumeSpecName "configmap-kubelet-serving-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 17 17:28:04.938796 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:28:04.937795 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c75de9fb-e653-480b-868f-b20405e94293-configmap-serving-certs-ca-bundle" (OuterVolumeSpecName: "configmap-serving-certs-ca-bundle") pod "c75de9fb-e653-480b-868f-b20405e94293" (UID: "c75de9fb-e653-480b-868f-b20405e94293"). InnerVolumeSpecName "configmap-serving-certs-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 17 17:28:04.938796 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:28:04.938775 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c75de9fb-e653-480b-868f-b20405e94293-prometheus-k8s-rulefiles-0" (OuterVolumeSpecName: "prometheus-k8s-rulefiles-0") pod "c75de9fb-e653-480b-868f-b20405e94293" (UID: "c75de9fb-e653-480b-868f-b20405e94293"). InnerVolumeSpecName "prometheus-k8s-rulefiles-0". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 17 17:28:04.939076 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:28:04.939047 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c75de9fb-e653-480b-868f-b20405e94293-prometheus-k8s-db" (OuterVolumeSpecName: "prometheus-k8s-db") pod "c75de9fb-e653-480b-868f-b20405e94293" (UID: "c75de9fb-e653-480b-868f-b20405e94293"). InnerVolumeSpecName "prometheus-k8s-db". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 17 17:28:04.939989 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:28:04.939816 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c75de9fb-e653-480b-868f-b20405e94293-secret-grpc-tls" (OuterVolumeSpecName: "secret-grpc-tls") pod "c75de9fb-e653-480b-868f-b20405e94293" (UID: "c75de9fb-e653-480b-868f-b20405e94293"). InnerVolumeSpecName "secret-grpc-tls". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 17 17:28:04.939989 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:28:04.939878 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c75de9fb-e653-480b-868f-b20405e94293-config-out" (OuterVolumeSpecName: "config-out") pod "c75de9fb-e653-480b-868f-b20405e94293" (UID: "c75de9fb-e653-480b-868f-b20405e94293"). InnerVolumeSpecName "config-out". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 17 17:28:04.939989 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:28:04.939902 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c75de9fb-e653-480b-868f-b20405e94293-secret-prometheus-k8s-thanos-sidecar-tls" (OuterVolumeSpecName: "secret-prometheus-k8s-thanos-sidecar-tls") pod "c75de9fb-e653-480b-868f-b20405e94293" (UID: "c75de9fb-e653-480b-868f-b20405e94293"). InnerVolumeSpecName "secret-prometheus-k8s-thanos-sidecar-tls". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 17 17:28:04.939989 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:28:04.939961 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c75de9fb-e653-480b-868f-b20405e94293-thanos-prometheus-http-client-file" (OuterVolumeSpecName: "thanos-prometheus-http-client-file") pod "c75de9fb-e653-480b-868f-b20405e94293" (UID: "c75de9fb-e653-480b-868f-b20405e94293"). InnerVolumeSpecName "thanos-prometheus-http-client-file". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 17 17:28:04.941070 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:28:04.941035 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c75de9fb-e653-480b-868f-b20405e94293-secret-prometheus-k8s-kube-rbac-proxy-web" (OuterVolumeSpecName: "secret-prometheus-k8s-kube-rbac-proxy-web") pod "c75de9fb-e653-480b-868f-b20405e94293" (UID: "c75de9fb-e653-480b-868f-b20405e94293"). InnerVolumeSpecName "secret-prometheus-k8s-kube-rbac-proxy-web". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 17 17:28:04.941233 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:28:04.941191 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c75de9fb-e653-480b-868f-b20405e94293-kube-api-access-8qxfb" (OuterVolumeSpecName: "kube-api-access-8qxfb") pod "c75de9fb-e653-480b-868f-b20405e94293" (UID: "c75de9fb-e653-480b-868f-b20405e94293"). InnerVolumeSpecName "kube-api-access-8qxfb". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 17 17:28:04.941369 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:28:04.941336 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c75de9fb-e653-480b-868f-b20405e94293-secret-prometheus-k8s-tls" (OuterVolumeSpecName: "secret-prometheus-k8s-tls") pod "c75de9fb-e653-480b-868f-b20405e94293" (UID: "c75de9fb-e653-480b-868f-b20405e94293"). InnerVolumeSpecName "secret-prometheus-k8s-tls". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 17 17:28:04.941457 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:28:04.941360 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c75de9fb-e653-480b-868f-b20405e94293-secret-metrics-client-certs" (OuterVolumeSpecName: "secret-metrics-client-certs") pod "c75de9fb-e653-480b-868f-b20405e94293" (UID: "c75de9fb-e653-480b-868f-b20405e94293"). InnerVolumeSpecName "secret-metrics-client-certs". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 17 17:28:04.941457 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:28:04.941444 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c75de9fb-e653-480b-868f-b20405e94293-secret-kube-rbac-proxy" (OuterVolumeSpecName: "secret-kube-rbac-proxy") pod "c75de9fb-e653-480b-868f-b20405e94293" (UID: "c75de9fb-e653-480b-868f-b20405e94293"). InnerVolumeSpecName "secret-kube-rbac-proxy". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 17 17:28:04.941696 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:28:04.941651 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c75de9fb-e653-480b-868f-b20405e94293-config" (OuterVolumeSpecName: "config") pod "c75de9fb-e653-480b-868f-b20405e94293" (UID: "c75de9fb-e653-480b-868f-b20405e94293"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 17 17:28:04.941784 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:28:04.941716 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c75de9fb-e653-480b-868f-b20405e94293-tls-assets" (OuterVolumeSpecName: "tls-assets") pod "c75de9fb-e653-480b-868f-b20405e94293" (UID: "c75de9fb-e653-480b-868f-b20405e94293"). InnerVolumeSpecName "tls-assets". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 17 17:28:04.948586 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:28:04.948568 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c75de9fb-e653-480b-868f-b20405e94293-web-config" (OuterVolumeSpecName: "web-config") pod "c75de9fb-e653-480b-868f-b20405e94293" (UID: "c75de9fb-e653-480b-868f-b20405e94293"). InnerVolumeSpecName "web-config". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 17 17:28:05.016059 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:28:05.016037 2570 generic.go:358] "Generic (PLEG): container finished" podID="c75de9fb-e653-480b-868f-b20405e94293" containerID="4ad4d0113f02741bb3667128bdea3b1413de9248e0714031769ffb840bb385c0" exitCode=0
Apr 17 17:28:05.016059 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:28:05.016057 2570 generic.go:358] "Generic (PLEG): container finished" podID="c75de9fb-e653-480b-868f-b20405e94293" containerID="bc5186514b53337aa44f1415ed43ee6d671846530dc8f14f00226a973143f775" exitCode=0
Apr 17 17:28:05.016185 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:28:05.016064 2570 generic.go:358] "Generic (PLEG): container finished" podID="c75de9fb-e653-480b-868f-b20405e94293" containerID="a617ba5ffc4026e3e5bdbe786e143f98ee410080729d058e92df66054ef6f5b1" exitCode=0
Apr 17 17:28:05.016185 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:28:05.016070 2570 generic.go:358] "Generic (PLEG): container finished" podID="c75de9fb-e653-480b-868f-b20405e94293" containerID="553731f26c8f98734ad741018513eea41509129aa2a05d82c8a42407f34740a8" exitCode=0
Apr 17 17:28:05.016185 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:28:05.016075 2570 generic.go:358] "Generic (PLEG): container finished" podID="c75de9fb-e653-480b-868f-b20405e94293" containerID="b430e72105853ae26e3bcda712021275d851c0e90b8ebffb9278eda0a750dad6" exitCode=0
Apr 17 17:28:05.016185 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:28:05.016079 2570 generic.go:358] "Generic (PLEG): container finished" podID="c75de9fb-e653-480b-868f-b20405e94293" containerID="4f36d3ab6dfc66656b98d924445f949141d6a197965d433f249a3aac45c6a189" exitCode=0
Apr 17 17:28:05.016185 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:28:05.016117 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"c75de9fb-e653-480b-868f-b20405e94293","Type":"ContainerDied","Data":"4ad4d0113f02741bb3667128bdea3b1413de9248e0714031769ffb840bb385c0"}
Apr 17 17:28:05.016185 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:28:05.016155 2570 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 17:28:05.016185 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:28:05.016173 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"c75de9fb-e653-480b-868f-b20405e94293","Type":"ContainerDied","Data":"bc5186514b53337aa44f1415ed43ee6d671846530dc8f14f00226a973143f775"}
Apr 17 17:28:05.016185 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:28:05.016186 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"c75de9fb-e653-480b-868f-b20405e94293","Type":"ContainerDied","Data":"a617ba5ffc4026e3e5bdbe786e143f98ee410080729d058e92df66054ef6f5b1"}
Apr 17 17:28:05.016421 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:28:05.016196 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"c75de9fb-e653-480b-868f-b20405e94293","Type":"ContainerDied","Data":"553731f26c8f98734ad741018513eea41509129aa2a05d82c8a42407f34740a8"}
Apr 17 17:28:05.016421 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:28:05.016205 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"c75de9fb-e653-480b-868f-b20405e94293","Type":"ContainerDied","Data":"b430e72105853ae26e3bcda712021275d851c0e90b8ebffb9278eda0a750dad6"}
Apr 17 17:28:05.016421 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:28:05.016214 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"c75de9fb-e653-480b-868f-b20405e94293","Type":"ContainerDied","Data":"4f36d3ab6dfc66656b98d924445f949141d6a197965d433f249a3aac45c6a189"}
Apr 17 17:28:05.016421 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:28:05.016223 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"c75de9fb-e653-480b-868f-b20405e94293","Type":"ContainerDied","Data":"ae8030297004e77494b9a5f21aa39d32d6783ef37cb2c58f3581f7a9468006c8"}
Apr 17 17:28:05.016421 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:28:05.016236 2570 scope.go:117] "RemoveContainer" containerID="4ad4d0113f02741bb3667128bdea3b1413de9248e0714031769ffb840bb385c0"
Apr 17 17:28:05.025953 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:28:05.025930 2570 scope.go:117] "RemoveContainer" containerID="bc5186514b53337aa44f1415ed43ee6d671846530dc8f14f00226a973143f775"
Apr 17 17:28:05.034290 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:28:05.034274 2570 scope.go:117] "RemoveContainer" containerID="a617ba5ffc4026e3e5bdbe786e143f98ee410080729d058e92df66054ef6f5b1"
Apr 17 17:28:05.037335 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:28:05.037319 2570 reconciler_common.go:299] "Volume detached for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/c75de9fb-e653-480b-868f-b20405e94293-secret-metrics-client-certs\") on node \"ip-10-0-136-202.ec2.internal\" DevicePath \"\""
Apr 17 17:28:05.037406 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:28:05.037338 2570 reconciler_common.go:299] "Volume detached for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/c75de9fb-e653-480b-868f-b20405e94293-prometheus-k8s-db\") on node \"ip-10-0-136-202.ec2.internal\" DevicePath \"\""
Apr 17 17:28:05.037406 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:28:05.037348 2570 reconciler_common.go:299] "Volume detached for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c75de9fb-e653-480b-868f-b20405e94293-configmap-serving-certs-ca-bundle\") on node \"ip-10-0-136-202.ec2.internal\" DevicePath \"\""
Apr 17 17:28:05.037406 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:28:05.037361 2570 reconciler_common.go:299] "Volume detached for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/c75de9fb-e653-480b-868f-b20405e94293-tls-assets\") on node \"ip-10-0-136-202.ec2.internal\" DevicePath \"\""
Apr 17 17:28:05.037406 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:28:05.037373 2570 reconciler_common.go:299] "Volume detached for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/c75de9fb-e653-480b-868f-b20405e94293-secret-prometheus-k8s-kube-rbac-proxy-web\") on node \"ip-10-0-136-202.ec2.internal\" DevicePath \"\""
Apr 17 17:28:05.037406 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:28:05.037382 2570 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-8qxfb\" (UniqueName: \"kubernetes.io/projected/c75de9fb-e653-480b-868f-b20405e94293-kube-api-access-8qxfb\") on node \"ip-10-0-136-202.ec2.internal\" DevicePath \"\""
Apr 17 17:28:05.037406 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:28:05.037391 2570 reconciler_common.go:299] "Volume detached for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/c75de9fb-e653-480b-868f-b20405e94293-secret-kube-rbac-proxy\") on node \"ip-10-0-136-202.ec2.internal\" DevicePath \"\""
Apr 17 17:28:05.037406 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:28:05.037400 2570 reconciler_common.go:299] "Volume detached for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/c75de9fb-e653-480b-868f-b20405e94293-secret-prometheus-k8s-tls\")
on node \"ip-10-0-136-202.ec2.internal\" DevicePath \"\"" Apr 17 17:28:05.037406 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:28:05.037418 2570 reconciler_common.go:299] "Volume detached for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c75de9fb-e653-480b-868f-b20405e94293-configmap-kubelet-serving-ca-bundle\") on node \"ip-10-0-136-202.ec2.internal\" DevicePath \"\"" Apr 17 17:28:05.037692 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:28:05.037428 2570 reconciler_common.go:299] "Volume detached for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/c75de9fb-e653-480b-868f-b20405e94293-secret-prometheus-k8s-thanos-sidecar-tls\") on node \"ip-10-0-136-202.ec2.internal\" DevicePath \"\"" Apr 17 17:28:05.037692 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:28:05.037440 2570 reconciler_common.go:299] "Volume detached for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/c75de9fb-e653-480b-868f-b20405e94293-config-out\") on node \"ip-10-0-136-202.ec2.internal\" DevicePath \"\"" Apr 17 17:28:05.037692 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:28:05.037452 2570 reconciler_common.go:299] "Volume detached for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/c75de9fb-e653-480b-868f-b20405e94293-thanos-prometheus-http-client-file\") on node \"ip-10-0-136-202.ec2.internal\" DevicePath \"\"" Apr 17 17:28:05.037692 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:28:05.037460 2570 reconciler_common.go:299] "Volume detached for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/c75de9fb-e653-480b-868f-b20405e94293-web-config\") on node \"ip-10-0-136-202.ec2.internal\" DevicePath \"\"" Apr 17 17:28:05.037692 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:28:05.037467 2570 reconciler_common.go:299] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/c75de9fb-e653-480b-868f-b20405e94293-config\") on node 
\"ip-10-0-136-202.ec2.internal\" DevicePath \"\"" Apr 17 17:28:05.037692 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:28:05.037475 2570 reconciler_common.go:299] "Volume detached for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/c75de9fb-e653-480b-868f-b20405e94293-configmap-metrics-client-ca\") on node \"ip-10-0-136-202.ec2.internal\" DevicePath \"\"" Apr 17 17:28:05.037692 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:28:05.037483 2570 reconciler_common.go:299] "Volume detached for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c75de9fb-e653-480b-868f-b20405e94293-prometheus-trusted-ca-bundle\") on node \"ip-10-0-136-202.ec2.internal\" DevicePath \"\"" Apr 17 17:28:05.037692 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:28:05.037495 2570 reconciler_common.go:299] "Volume detached for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/c75de9fb-e653-480b-868f-b20405e94293-secret-grpc-tls\") on node \"ip-10-0-136-202.ec2.internal\" DevicePath \"\"" Apr 17 17:28:05.037692 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:28:05.037507 2570 reconciler_common.go:299] "Volume detached for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/c75de9fb-e653-480b-868f-b20405e94293-prometheus-k8s-rulefiles-0\") on node \"ip-10-0-136-202.ec2.internal\" DevicePath \"\"" Apr 17 17:28:05.039940 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:28:05.039918 2570 scope.go:117] "RemoveContainer" containerID="553731f26c8f98734ad741018513eea41509129aa2a05d82c8a42407f34740a8" Apr 17 17:28:05.044646 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:28:05.044625 2570 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 17 17:28:05.046294 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:28:05.046271 2570 scope.go:117] "RemoveContainer" containerID="b430e72105853ae26e3bcda712021275d851c0e90b8ebffb9278eda0a750dad6" Apr 17 17:28:05.050549 ip-10-0-136-202 
kubenswrapper[2570]: I0417 17:28:05.050531 2570 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 17 17:28:05.052050 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:28:05.052034 2570 scope.go:117] "RemoveContainer" containerID="4f36d3ab6dfc66656b98d924445f949141d6a197965d433f249a3aac45c6a189" Apr 17 17:28:05.058018 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:28:05.057999 2570 scope.go:117] "RemoveContainer" containerID="3e9f75d4a98e2ee125ddc65cd53a0851782d967b9d4b84d389a3d24cbd3a91b9" Apr 17 17:28:05.063475 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:28:05.063461 2570 scope.go:117] "RemoveContainer" containerID="4ad4d0113f02741bb3667128bdea3b1413de9248e0714031769ffb840bb385c0" Apr 17 17:28:05.063724 ip-10-0-136-202 kubenswrapper[2570]: E0417 17:28:05.063706 2570 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4ad4d0113f02741bb3667128bdea3b1413de9248e0714031769ffb840bb385c0\": container with ID starting with 4ad4d0113f02741bb3667128bdea3b1413de9248e0714031769ffb840bb385c0 not found: ID does not exist" containerID="4ad4d0113f02741bb3667128bdea3b1413de9248e0714031769ffb840bb385c0" Apr 17 17:28:05.063766 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:28:05.063731 2570 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4ad4d0113f02741bb3667128bdea3b1413de9248e0714031769ffb840bb385c0"} err="failed to get container status \"4ad4d0113f02741bb3667128bdea3b1413de9248e0714031769ffb840bb385c0\": rpc error: code = NotFound desc = could not find container \"4ad4d0113f02741bb3667128bdea3b1413de9248e0714031769ffb840bb385c0\": container with ID starting with 4ad4d0113f02741bb3667128bdea3b1413de9248e0714031769ffb840bb385c0 not found: ID does not exist" Apr 17 17:28:05.063766 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:28:05.063749 2570 scope.go:117] "RemoveContainer" 
containerID="bc5186514b53337aa44f1415ed43ee6d671846530dc8f14f00226a973143f775" Apr 17 17:28:05.063967 ip-10-0-136-202 kubenswrapper[2570]: E0417 17:28:05.063949 2570 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bc5186514b53337aa44f1415ed43ee6d671846530dc8f14f00226a973143f775\": container with ID starting with bc5186514b53337aa44f1415ed43ee6d671846530dc8f14f00226a973143f775 not found: ID does not exist" containerID="bc5186514b53337aa44f1415ed43ee6d671846530dc8f14f00226a973143f775" Apr 17 17:28:05.064032 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:28:05.063977 2570 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bc5186514b53337aa44f1415ed43ee6d671846530dc8f14f00226a973143f775"} err="failed to get container status \"bc5186514b53337aa44f1415ed43ee6d671846530dc8f14f00226a973143f775\": rpc error: code = NotFound desc = could not find container \"bc5186514b53337aa44f1415ed43ee6d671846530dc8f14f00226a973143f775\": container with ID starting with bc5186514b53337aa44f1415ed43ee6d671846530dc8f14f00226a973143f775 not found: ID does not exist" Apr 17 17:28:05.064032 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:28:05.064001 2570 scope.go:117] "RemoveContainer" containerID="a617ba5ffc4026e3e5bdbe786e143f98ee410080729d058e92df66054ef6f5b1" Apr 17 17:28:05.064243 ip-10-0-136-202 kubenswrapper[2570]: E0417 17:28:05.064226 2570 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a617ba5ffc4026e3e5bdbe786e143f98ee410080729d058e92df66054ef6f5b1\": container with ID starting with a617ba5ffc4026e3e5bdbe786e143f98ee410080729d058e92df66054ef6f5b1 not found: ID does not exist" containerID="a617ba5ffc4026e3e5bdbe786e143f98ee410080729d058e92df66054ef6f5b1" Apr 17 17:28:05.064289 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:28:05.064247 2570 pod_container_deletor.go:53] "DeleteContainer returned 
error" containerID={"Type":"cri-o","ID":"a617ba5ffc4026e3e5bdbe786e143f98ee410080729d058e92df66054ef6f5b1"} err="failed to get container status \"a617ba5ffc4026e3e5bdbe786e143f98ee410080729d058e92df66054ef6f5b1\": rpc error: code = NotFound desc = could not find container \"a617ba5ffc4026e3e5bdbe786e143f98ee410080729d058e92df66054ef6f5b1\": container with ID starting with a617ba5ffc4026e3e5bdbe786e143f98ee410080729d058e92df66054ef6f5b1 not found: ID does not exist" Apr 17 17:28:05.064289 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:28:05.064265 2570 scope.go:117] "RemoveContainer" containerID="553731f26c8f98734ad741018513eea41509129aa2a05d82c8a42407f34740a8" Apr 17 17:28:05.064485 ip-10-0-136-202 kubenswrapper[2570]: E0417 17:28:05.064470 2570 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"553731f26c8f98734ad741018513eea41509129aa2a05d82c8a42407f34740a8\": container with ID starting with 553731f26c8f98734ad741018513eea41509129aa2a05d82c8a42407f34740a8 not found: ID does not exist" containerID="553731f26c8f98734ad741018513eea41509129aa2a05d82c8a42407f34740a8" Apr 17 17:28:05.064529 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:28:05.064488 2570 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"553731f26c8f98734ad741018513eea41509129aa2a05d82c8a42407f34740a8"} err="failed to get container status \"553731f26c8f98734ad741018513eea41509129aa2a05d82c8a42407f34740a8\": rpc error: code = NotFound desc = could not find container \"553731f26c8f98734ad741018513eea41509129aa2a05d82c8a42407f34740a8\": container with ID starting with 553731f26c8f98734ad741018513eea41509129aa2a05d82c8a42407f34740a8 not found: ID does not exist" Apr 17 17:28:05.064529 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:28:05.064501 2570 scope.go:117] "RemoveContainer" containerID="b430e72105853ae26e3bcda712021275d851c0e90b8ebffb9278eda0a750dad6" Apr 17 17:28:05.064670 
ip-10-0-136-202 kubenswrapper[2570]: E0417 17:28:05.064654 2570 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b430e72105853ae26e3bcda712021275d851c0e90b8ebffb9278eda0a750dad6\": container with ID starting with b430e72105853ae26e3bcda712021275d851c0e90b8ebffb9278eda0a750dad6 not found: ID does not exist" containerID="b430e72105853ae26e3bcda712021275d851c0e90b8ebffb9278eda0a750dad6" Apr 17 17:28:05.064708 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:28:05.064674 2570 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b430e72105853ae26e3bcda712021275d851c0e90b8ebffb9278eda0a750dad6"} err="failed to get container status \"b430e72105853ae26e3bcda712021275d851c0e90b8ebffb9278eda0a750dad6\": rpc error: code = NotFound desc = could not find container \"b430e72105853ae26e3bcda712021275d851c0e90b8ebffb9278eda0a750dad6\": container with ID starting with b430e72105853ae26e3bcda712021275d851c0e90b8ebffb9278eda0a750dad6 not found: ID does not exist" Apr 17 17:28:05.064708 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:28:05.064686 2570 scope.go:117] "RemoveContainer" containerID="4f36d3ab6dfc66656b98d924445f949141d6a197965d433f249a3aac45c6a189" Apr 17 17:28:05.064857 ip-10-0-136-202 kubenswrapper[2570]: E0417 17:28:05.064844 2570 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4f36d3ab6dfc66656b98d924445f949141d6a197965d433f249a3aac45c6a189\": container with ID starting with 4f36d3ab6dfc66656b98d924445f949141d6a197965d433f249a3aac45c6a189 not found: ID does not exist" containerID="4f36d3ab6dfc66656b98d924445f949141d6a197965d433f249a3aac45c6a189" Apr 17 17:28:05.064900 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:28:05.064860 2570 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"4f36d3ab6dfc66656b98d924445f949141d6a197965d433f249a3aac45c6a189"} err="failed to get container status \"4f36d3ab6dfc66656b98d924445f949141d6a197965d433f249a3aac45c6a189\": rpc error: code = NotFound desc = could not find container \"4f36d3ab6dfc66656b98d924445f949141d6a197965d433f249a3aac45c6a189\": container with ID starting with 4f36d3ab6dfc66656b98d924445f949141d6a197965d433f249a3aac45c6a189 not found: ID does not exist" Apr 17 17:28:05.064900 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:28:05.064872 2570 scope.go:117] "RemoveContainer" containerID="3e9f75d4a98e2ee125ddc65cd53a0851782d967b9d4b84d389a3d24cbd3a91b9" Apr 17 17:28:05.065028 ip-10-0-136-202 kubenswrapper[2570]: E0417 17:28:05.065008 2570 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3e9f75d4a98e2ee125ddc65cd53a0851782d967b9d4b84d389a3d24cbd3a91b9\": container with ID starting with 3e9f75d4a98e2ee125ddc65cd53a0851782d967b9d4b84d389a3d24cbd3a91b9 not found: ID does not exist" containerID="3e9f75d4a98e2ee125ddc65cd53a0851782d967b9d4b84d389a3d24cbd3a91b9" Apr 17 17:28:05.065068 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:28:05.065035 2570 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3e9f75d4a98e2ee125ddc65cd53a0851782d967b9d4b84d389a3d24cbd3a91b9"} err="failed to get container status \"3e9f75d4a98e2ee125ddc65cd53a0851782d967b9d4b84d389a3d24cbd3a91b9\": rpc error: code = NotFound desc = could not find container \"3e9f75d4a98e2ee125ddc65cd53a0851782d967b9d4b84d389a3d24cbd3a91b9\": container with ID starting with 3e9f75d4a98e2ee125ddc65cd53a0851782d967b9d4b84d389a3d24cbd3a91b9 not found: ID does not exist" Apr 17 17:28:05.065068 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:28:05.065049 2570 scope.go:117] "RemoveContainer" containerID="4ad4d0113f02741bb3667128bdea3b1413de9248e0714031769ffb840bb385c0" Apr 17 17:28:05.065278 ip-10-0-136-202 
kubenswrapper[2570]: I0417 17:28:05.065256 2570 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4ad4d0113f02741bb3667128bdea3b1413de9248e0714031769ffb840bb385c0"} err="failed to get container status \"4ad4d0113f02741bb3667128bdea3b1413de9248e0714031769ffb840bb385c0\": rpc error: code = NotFound desc = could not find container \"4ad4d0113f02741bb3667128bdea3b1413de9248e0714031769ffb840bb385c0\": container with ID starting with 4ad4d0113f02741bb3667128bdea3b1413de9248e0714031769ffb840bb385c0 not found: ID does not exist" Apr 17 17:28:05.065330 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:28:05.065281 2570 scope.go:117] "RemoveContainer" containerID="bc5186514b53337aa44f1415ed43ee6d671846530dc8f14f00226a973143f775" Apr 17 17:28:05.065525 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:28:05.065509 2570 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bc5186514b53337aa44f1415ed43ee6d671846530dc8f14f00226a973143f775"} err="failed to get container status \"bc5186514b53337aa44f1415ed43ee6d671846530dc8f14f00226a973143f775\": rpc error: code = NotFound desc = could not find container \"bc5186514b53337aa44f1415ed43ee6d671846530dc8f14f00226a973143f775\": container with ID starting with bc5186514b53337aa44f1415ed43ee6d671846530dc8f14f00226a973143f775 not found: ID does not exist" Apr 17 17:28:05.065581 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:28:05.065526 2570 scope.go:117] "RemoveContainer" containerID="a617ba5ffc4026e3e5bdbe786e143f98ee410080729d058e92df66054ef6f5b1" Apr 17 17:28:05.065749 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:28:05.065724 2570 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a617ba5ffc4026e3e5bdbe786e143f98ee410080729d058e92df66054ef6f5b1"} err="failed to get container status \"a617ba5ffc4026e3e5bdbe786e143f98ee410080729d058e92df66054ef6f5b1\": rpc error: code = NotFound desc = could not find container 
\"a617ba5ffc4026e3e5bdbe786e143f98ee410080729d058e92df66054ef6f5b1\": container with ID starting with a617ba5ffc4026e3e5bdbe786e143f98ee410080729d058e92df66054ef6f5b1 not found: ID does not exist" Apr 17 17:28:05.065805 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:28:05.065749 2570 scope.go:117] "RemoveContainer" containerID="553731f26c8f98734ad741018513eea41509129aa2a05d82c8a42407f34740a8" Apr 17 17:28:05.065949 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:28:05.065935 2570 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"553731f26c8f98734ad741018513eea41509129aa2a05d82c8a42407f34740a8"} err="failed to get container status \"553731f26c8f98734ad741018513eea41509129aa2a05d82c8a42407f34740a8\": rpc error: code = NotFound desc = could not find container \"553731f26c8f98734ad741018513eea41509129aa2a05d82c8a42407f34740a8\": container with ID starting with 553731f26c8f98734ad741018513eea41509129aa2a05d82c8a42407f34740a8 not found: ID does not exist" Apr 17 17:28:05.065949 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:28:05.065947 2570 scope.go:117] "RemoveContainer" containerID="b430e72105853ae26e3bcda712021275d851c0e90b8ebffb9278eda0a750dad6" Apr 17 17:28:05.066104 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:28:05.066090 2570 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b430e72105853ae26e3bcda712021275d851c0e90b8ebffb9278eda0a750dad6"} err="failed to get container status \"b430e72105853ae26e3bcda712021275d851c0e90b8ebffb9278eda0a750dad6\": rpc error: code = NotFound desc = could not find container \"b430e72105853ae26e3bcda712021275d851c0e90b8ebffb9278eda0a750dad6\": container with ID starting with b430e72105853ae26e3bcda712021275d851c0e90b8ebffb9278eda0a750dad6 not found: ID does not exist" Apr 17 17:28:05.066104 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:28:05.066103 2570 scope.go:117] "RemoveContainer" 
containerID="4f36d3ab6dfc66656b98d924445f949141d6a197965d433f249a3aac45c6a189" Apr 17 17:28:05.066265 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:28:05.066250 2570 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4f36d3ab6dfc66656b98d924445f949141d6a197965d433f249a3aac45c6a189"} err="failed to get container status \"4f36d3ab6dfc66656b98d924445f949141d6a197965d433f249a3aac45c6a189\": rpc error: code = NotFound desc = could not find container \"4f36d3ab6dfc66656b98d924445f949141d6a197965d433f249a3aac45c6a189\": container with ID starting with 4f36d3ab6dfc66656b98d924445f949141d6a197965d433f249a3aac45c6a189 not found: ID does not exist" Apr 17 17:28:05.066316 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:28:05.066265 2570 scope.go:117] "RemoveContainer" containerID="3e9f75d4a98e2ee125ddc65cd53a0851782d967b9d4b84d389a3d24cbd3a91b9" Apr 17 17:28:05.066465 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:28:05.066445 2570 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3e9f75d4a98e2ee125ddc65cd53a0851782d967b9d4b84d389a3d24cbd3a91b9"} err="failed to get container status \"3e9f75d4a98e2ee125ddc65cd53a0851782d967b9d4b84d389a3d24cbd3a91b9\": rpc error: code = NotFound desc = could not find container \"3e9f75d4a98e2ee125ddc65cd53a0851782d967b9d4b84d389a3d24cbd3a91b9\": container with ID starting with 3e9f75d4a98e2ee125ddc65cd53a0851782d967b9d4b84d389a3d24cbd3a91b9 not found: ID does not exist" Apr 17 17:28:05.066512 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:28:05.066466 2570 scope.go:117] "RemoveContainer" containerID="4ad4d0113f02741bb3667128bdea3b1413de9248e0714031769ffb840bb385c0" Apr 17 17:28:05.066654 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:28:05.066640 2570 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4ad4d0113f02741bb3667128bdea3b1413de9248e0714031769ffb840bb385c0"} err="failed to get container status 
\"4ad4d0113f02741bb3667128bdea3b1413de9248e0714031769ffb840bb385c0\": rpc error: code = NotFound desc = could not find container \"4ad4d0113f02741bb3667128bdea3b1413de9248e0714031769ffb840bb385c0\": container with ID starting with 4ad4d0113f02741bb3667128bdea3b1413de9248e0714031769ffb840bb385c0 not found: ID does not exist" Apr 17 17:28:05.066695 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:28:05.066655 2570 scope.go:117] "RemoveContainer" containerID="bc5186514b53337aa44f1415ed43ee6d671846530dc8f14f00226a973143f775" Apr 17 17:28:05.066809 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:28:05.066797 2570 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bc5186514b53337aa44f1415ed43ee6d671846530dc8f14f00226a973143f775"} err="failed to get container status \"bc5186514b53337aa44f1415ed43ee6d671846530dc8f14f00226a973143f775\": rpc error: code = NotFound desc = could not find container \"bc5186514b53337aa44f1415ed43ee6d671846530dc8f14f00226a973143f775\": container with ID starting with bc5186514b53337aa44f1415ed43ee6d671846530dc8f14f00226a973143f775 not found: ID does not exist" Apr 17 17:28:05.066809 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:28:05.066808 2570 scope.go:117] "RemoveContainer" containerID="a617ba5ffc4026e3e5bdbe786e143f98ee410080729d058e92df66054ef6f5b1" Apr 17 17:28:05.067006 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:28:05.066989 2570 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a617ba5ffc4026e3e5bdbe786e143f98ee410080729d058e92df66054ef6f5b1"} err="failed to get container status \"a617ba5ffc4026e3e5bdbe786e143f98ee410080729d058e92df66054ef6f5b1\": rpc error: code = NotFound desc = could not find container \"a617ba5ffc4026e3e5bdbe786e143f98ee410080729d058e92df66054ef6f5b1\": container with ID starting with a617ba5ffc4026e3e5bdbe786e143f98ee410080729d058e92df66054ef6f5b1 not found: ID does not exist" Apr 17 17:28:05.067059 ip-10-0-136-202 
kubenswrapper[2570]: I0417 17:28:05.067007 2570 scope.go:117] "RemoveContainer" containerID="553731f26c8f98734ad741018513eea41509129aa2a05d82c8a42407f34740a8" Apr 17 17:28:05.067248 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:28:05.067230 2570 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"553731f26c8f98734ad741018513eea41509129aa2a05d82c8a42407f34740a8"} err="failed to get container status \"553731f26c8f98734ad741018513eea41509129aa2a05d82c8a42407f34740a8\": rpc error: code = NotFound desc = could not find container \"553731f26c8f98734ad741018513eea41509129aa2a05d82c8a42407f34740a8\": container with ID starting with 553731f26c8f98734ad741018513eea41509129aa2a05d82c8a42407f34740a8 not found: ID does not exist" Apr 17 17:28:05.067307 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:28:05.067250 2570 scope.go:117] "RemoveContainer" containerID="b430e72105853ae26e3bcda712021275d851c0e90b8ebffb9278eda0a750dad6" Apr 17 17:28:05.067464 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:28:05.067444 2570 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b430e72105853ae26e3bcda712021275d851c0e90b8ebffb9278eda0a750dad6"} err="failed to get container status \"b430e72105853ae26e3bcda712021275d851c0e90b8ebffb9278eda0a750dad6\": rpc error: code = NotFound desc = could not find container \"b430e72105853ae26e3bcda712021275d851c0e90b8ebffb9278eda0a750dad6\": container with ID starting with b430e72105853ae26e3bcda712021275d851c0e90b8ebffb9278eda0a750dad6 not found: ID does not exist" Apr 17 17:28:05.067464 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:28:05.067461 2570 scope.go:117] "RemoveContainer" containerID="4f36d3ab6dfc66656b98d924445f949141d6a197965d433f249a3aac45c6a189" Apr 17 17:28:05.067667 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:28:05.067650 2570 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"4f36d3ab6dfc66656b98d924445f949141d6a197965d433f249a3aac45c6a189"} err="failed to get container status \"4f36d3ab6dfc66656b98d924445f949141d6a197965d433f249a3aac45c6a189\": rpc error: code = NotFound desc = could not find container \"4f36d3ab6dfc66656b98d924445f949141d6a197965d433f249a3aac45c6a189\": container with ID starting with 4f36d3ab6dfc66656b98d924445f949141d6a197965d433f249a3aac45c6a189 not found: ID does not exist" Apr 17 17:28:05.067707 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:28:05.067668 2570 scope.go:117] "RemoveContainer" containerID="3e9f75d4a98e2ee125ddc65cd53a0851782d967b9d4b84d389a3d24cbd3a91b9" Apr 17 17:28:05.067867 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:28:05.067851 2570 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3e9f75d4a98e2ee125ddc65cd53a0851782d967b9d4b84d389a3d24cbd3a91b9"} err="failed to get container status \"3e9f75d4a98e2ee125ddc65cd53a0851782d967b9d4b84d389a3d24cbd3a91b9\": rpc error: code = NotFound desc = could not find container \"3e9f75d4a98e2ee125ddc65cd53a0851782d967b9d4b84d389a3d24cbd3a91b9\": container with ID starting with 3e9f75d4a98e2ee125ddc65cd53a0851782d967b9d4b84d389a3d24cbd3a91b9 not found: ID does not exist" Apr 17 17:28:05.067910 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:28:05.067868 2570 scope.go:117] "RemoveContainer" containerID="4ad4d0113f02741bb3667128bdea3b1413de9248e0714031769ffb840bb385c0" Apr 17 17:28:05.068055 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:28:05.068041 2570 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4ad4d0113f02741bb3667128bdea3b1413de9248e0714031769ffb840bb385c0"} err="failed to get container status \"4ad4d0113f02741bb3667128bdea3b1413de9248e0714031769ffb840bb385c0\": rpc error: code = NotFound desc = could not find container \"4ad4d0113f02741bb3667128bdea3b1413de9248e0714031769ffb840bb385c0\": container with ID starting with 
4ad4d0113f02741bb3667128bdea3b1413de9248e0714031769ffb840bb385c0 not found: ID does not exist" Apr 17 17:28:05.068100 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:28:05.068055 2570 scope.go:117] "RemoveContainer" containerID="bc5186514b53337aa44f1415ed43ee6d671846530dc8f14f00226a973143f775" Apr 17 17:28:05.068279 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:28:05.068262 2570 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bc5186514b53337aa44f1415ed43ee6d671846530dc8f14f00226a973143f775"} err="failed to get container status \"bc5186514b53337aa44f1415ed43ee6d671846530dc8f14f00226a973143f775\": rpc error: code = NotFound desc = could not find container \"bc5186514b53337aa44f1415ed43ee6d671846530dc8f14f00226a973143f775\": container with ID starting with bc5186514b53337aa44f1415ed43ee6d671846530dc8f14f00226a973143f775 not found: ID does not exist" Apr 17 17:28:05.068338 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:28:05.068280 2570 scope.go:117] "RemoveContainer" containerID="a617ba5ffc4026e3e5bdbe786e143f98ee410080729d058e92df66054ef6f5b1" Apr 17 17:28:05.068447 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:28:05.068427 2570 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a617ba5ffc4026e3e5bdbe786e143f98ee410080729d058e92df66054ef6f5b1"} err="failed to get container status \"a617ba5ffc4026e3e5bdbe786e143f98ee410080729d058e92df66054ef6f5b1\": rpc error: code = NotFound desc = could not find container \"a617ba5ffc4026e3e5bdbe786e143f98ee410080729d058e92df66054ef6f5b1\": container with ID starting with a617ba5ffc4026e3e5bdbe786e143f98ee410080729d058e92df66054ef6f5b1 not found: ID does not exist" Apr 17 17:28:05.068498 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:28:05.068457 2570 scope.go:117] "RemoveContainer" containerID="553731f26c8f98734ad741018513eea41509129aa2a05d82c8a42407f34740a8" Apr 17 17:28:05.068651 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:28:05.068635 2570 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"553731f26c8f98734ad741018513eea41509129aa2a05d82c8a42407f34740a8"} err="failed to get container status \"553731f26c8f98734ad741018513eea41509129aa2a05d82c8a42407f34740a8\": rpc error: code = NotFound desc = could not find container \"553731f26c8f98734ad741018513eea41509129aa2a05d82c8a42407f34740a8\": container with ID starting with 553731f26c8f98734ad741018513eea41509129aa2a05d82c8a42407f34740a8 not found: ID does not exist" Apr 17 17:28:05.068698 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:28:05.068650 2570 scope.go:117] "RemoveContainer" containerID="b430e72105853ae26e3bcda712021275d851c0e90b8ebffb9278eda0a750dad6" Apr 17 17:28:05.068859 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:28:05.068842 2570 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b430e72105853ae26e3bcda712021275d851c0e90b8ebffb9278eda0a750dad6"} err="failed to get container status \"b430e72105853ae26e3bcda712021275d851c0e90b8ebffb9278eda0a750dad6\": rpc error: code = NotFound desc = could not find container \"b430e72105853ae26e3bcda712021275d851c0e90b8ebffb9278eda0a750dad6\": container with ID starting with b430e72105853ae26e3bcda712021275d851c0e90b8ebffb9278eda0a750dad6 not found: ID does not exist" Apr 17 17:28:05.068906 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:28:05.068860 2570 scope.go:117] "RemoveContainer" containerID="4f36d3ab6dfc66656b98d924445f949141d6a197965d433f249a3aac45c6a189" Apr 17 17:28:05.069178 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:28:05.069126 2570 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4f36d3ab6dfc66656b98d924445f949141d6a197965d433f249a3aac45c6a189"} err="failed to get container status \"4f36d3ab6dfc66656b98d924445f949141d6a197965d433f249a3aac45c6a189\": rpc error: code = NotFound desc = could not find container 
\"4f36d3ab6dfc66656b98d924445f949141d6a197965d433f249a3aac45c6a189\": container with ID starting with 4f36d3ab6dfc66656b98d924445f949141d6a197965d433f249a3aac45c6a189 not found: ID does not exist" Apr 17 17:28:05.069178 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:28:05.069158 2570 scope.go:117] "RemoveContainer" containerID="3e9f75d4a98e2ee125ddc65cd53a0851782d967b9d4b84d389a3d24cbd3a91b9" Apr 17 17:28:05.069429 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:28:05.069396 2570 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3e9f75d4a98e2ee125ddc65cd53a0851782d967b9d4b84d389a3d24cbd3a91b9"} err="failed to get container status \"3e9f75d4a98e2ee125ddc65cd53a0851782d967b9d4b84d389a3d24cbd3a91b9\": rpc error: code = NotFound desc = could not find container \"3e9f75d4a98e2ee125ddc65cd53a0851782d967b9d4b84d389a3d24cbd3a91b9\": container with ID starting with 3e9f75d4a98e2ee125ddc65cd53a0851782d967b9d4b84d389a3d24cbd3a91b9 not found: ID does not exist" Apr 17 17:28:05.069503 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:28:05.069430 2570 scope.go:117] "RemoveContainer" containerID="4ad4d0113f02741bb3667128bdea3b1413de9248e0714031769ffb840bb385c0" Apr 17 17:28:05.069721 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:28:05.069700 2570 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4ad4d0113f02741bb3667128bdea3b1413de9248e0714031769ffb840bb385c0"} err="failed to get container status \"4ad4d0113f02741bb3667128bdea3b1413de9248e0714031769ffb840bb385c0\": rpc error: code = NotFound desc = could not find container \"4ad4d0113f02741bb3667128bdea3b1413de9248e0714031769ffb840bb385c0\": container with ID starting with 4ad4d0113f02741bb3667128bdea3b1413de9248e0714031769ffb840bb385c0 not found: ID does not exist" Apr 17 17:28:05.069804 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:28:05.069722 2570 scope.go:117] "RemoveContainer" 
containerID="bc5186514b53337aa44f1415ed43ee6d671846530dc8f14f00226a973143f775" Apr 17 17:28:05.069960 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:28:05.069942 2570 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bc5186514b53337aa44f1415ed43ee6d671846530dc8f14f00226a973143f775"} err="failed to get container status \"bc5186514b53337aa44f1415ed43ee6d671846530dc8f14f00226a973143f775\": rpc error: code = NotFound desc = could not find container \"bc5186514b53337aa44f1415ed43ee6d671846530dc8f14f00226a973143f775\": container with ID starting with bc5186514b53337aa44f1415ed43ee6d671846530dc8f14f00226a973143f775 not found: ID does not exist" Apr 17 17:28:05.070010 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:28:05.069962 2570 scope.go:117] "RemoveContainer" containerID="a617ba5ffc4026e3e5bdbe786e143f98ee410080729d058e92df66054ef6f5b1" Apr 17 17:28:05.070179 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:28:05.070160 2570 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a617ba5ffc4026e3e5bdbe786e143f98ee410080729d058e92df66054ef6f5b1"} err="failed to get container status \"a617ba5ffc4026e3e5bdbe786e143f98ee410080729d058e92df66054ef6f5b1\": rpc error: code = NotFound desc = could not find container \"a617ba5ffc4026e3e5bdbe786e143f98ee410080729d058e92df66054ef6f5b1\": container with ID starting with a617ba5ffc4026e3e5bdbe786e143f98ee410080729d058e92df66054ef6f5b1 not found: ID does not exist" Apr 17 17:28:05.070233 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:28:05.070181 2570 scope.go:117] "RemoveContainer" containerID="553731f26c8f98734ad741018513eea41509129aa2a05d82c8a42407f34740a8" Apr 17 17:28:05.070407 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:28:05.070390 2570 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"553731f26c8f98734ad741018513eea41509129aa2a05d82c8a42407f34740a8"} err="failed to get container status 
\"553731f26c8f98734ad741018513eea41509129aa2a05d82c8a42407f34740a8\": rpc error: code = NotFound desc = could not find container \"553731f26c8f98734ad741018513eea41509129aa2a05d82c8a42407f34740a8\": container with ID starting with 553731f26c8f98734ad741018513eea41509129aa2a05d82c8a42407f34740a8 not found: ID does not exist" Apr 17 17:28:05.070459 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:28:05.070409 2570 scope.go:117] "RemoveContainer" containerID="b430e72105853ae26e3bcda712021275d851c0e90b8ebffb9278eda0a750dad6" Apr 17 17:28:05.070611 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:28:05.070593 2570 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b430e72105853ae26e3bcda712021275d851c0e90b8ebffb9278eda0a750dad6"} err="failed to get container status \"b430e72105853ae26e3bcda712021275d851c0e90b8ebffb9278eda0a750dad6\": rpc error: code = NotFound desc = could not find container \"b430e72105853ae26e3bcda712021275d851c0e90b8ebffb9278eda0a750dad6\": container with ID starting with b430e72105853ae26e3bcda712021275d851c0e90b8ebffb9278eda0a750dad6 not found: ID does not exist" Apr 17 17:28:05.070673 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:28:05.070613 2570 scope.go:117] "RemoveContainer" containerID="4f36d3ab6dfc66656b98d924445f949141d6a197965d433f249a3aac45c6a189" Apr 17 17:28:05.070818 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:28:05.070799 2570 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4f36d3ab6dfc66656b98d924445f949141d6a197965d433f249a3aac45c6a189"} err="failed to get container status \"4f36d3ab6dfc66656b98d924445f949141d6a197965d433f249a3aac45c6a189\": rpc error: code = NotFound desc = could not find container \"4f36d3ab6dfc66656b98d924445f949141d6a197965d433f249a3aac45c6a189\": container with ID starting with 4f36d3ab6dfc66656b98d924445f949141d6a197965d433f249a3aac45c6a189 not found: ID does not exist" Apr 17 17:28:05.070874 ip-10-0-136-202 
kubenswrapper[2570]: I0417 17:28:05.070820 2570 scope.go:117] "RemoveContainer" containerID="3e9f75d4a98e2ee125ddc65cd53a0851782d967b9d4b84d389a3d24cbd3a91b9" Apr 17 17:28:05.071032 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:28:05.071015 2570 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3e9f75d4a98e2ee125ddc65cd53a0851782d967b9d4b84d389a3d24cbd3a91b9"} err="failed to get container status \"3e9f75d4a98e2ee125ddc65cd53a0851782d967b9d4b84d389a3d24cbd3a91b9\": rpc error: code = NotFound desc = could not find container \"3e9f75d4a98e2ee125ddc65cd53a0851782d967b9d4b84d389a3d24cbd3a91b9\": container with ID starting with 3e9f75d4a98e2ee125ddc65cd53a0851782d967b9d4b84d389a3d24cbd3a91b9 not found: ID does not exist" Apr 17 17:28:05.071103 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:28:05.071034 2570 scope.go:117] "RemoveContainer" containerID="4ad4d0113f02741bb3667128bdea3b1413de9248e0714031769ffb840bb385c0" Apr 17 17:28:05.071276 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:28:05.071259 2570 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4ad4d0113f02741bb3667128bdea3b1413de9248e0714031769ffb840bb385c0"} err="failed to get container status \"4ad4d0113f02741bb3667128bdea3b1413de9248e0714031769ffb840bb385c0\": rpc error: code = NotFound desc = could not find container \"4ad4d0113f02741bb3667128bdea3b1413de9248e0714031769ffb840bb385c0\": container with ID starting with 4ad4d0113f02741bb3667128bdea3b1413de9248e0714031769ffb840bb385c0 not found: ID does not exist" Apr 17 17:28:05.071349 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:28:05.071277 2570 scope.go:117] "RemoveContainer" containerID="bc5186514b53337aa44f1415ed43ee6d671846530dc8f14f00226a973143f775" Apr 17 17:28:05.071482 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:28:05.071466 2570 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"bc5186514b53337aa44f1415ed43ee6d671846530dc8f14f00226a973143f775"} err="failed to get container status \"bc5186514b53337aa44f1415ed43ee6d671846530dc8f14f00226a973143f775\": rpc error: code = NotFound desc = could not find container \"bc5186514b53337aa44f1415ed43ee6d671846530dc8f14f00226a973143f775\": container with ID starting with bc5186514b53337aa44f1415ed43ee6d671846530dc8f14f00226a973143f775 not found: ID does not exist" Apr 17 17:28:05.071544 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:28:05.071485 2570 scope.go:117] "RemoveContainer" containerID="a617ba5ffc4026e3e5bdbe786e143f98ee410080729d058e92df66054ef6f5b1" Apr 17 17:28:05.071685 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:28:05.071670 2570 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a617ba5ffc4026e3e5bdbe786e143f98ee410080729d058e92df66054ef6f5b1"} err="failed to get container status \"a617ba5ffc4026e3e5bdbe786e143f98ee410080729d058e92df66054ef6f5b1\": rpc error: code = NotFound desc = could not find container \"a617ba5ffc4026e3e5bdbe786e143f98ee410080729d058e92df66054ef6f5b1\": container with ID starting with a617ba5ffc4026e3e5bdbe786e143f98ee410080729d058e92df66054ef6f5b1 not found: ID does not exist" Apr 17 17:28:05.071746 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:28:05.071686 2570 scope.go:117] "RemoveContainer" containerID="553731f26c8f98734ad741018513eea41509129aa2a05d82c8a42407f34740a8" Apr 17 17:28:05.071897 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:28:05.071867 2570 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"553731f26c8f98734ad741018513eea41509129aa2a05d82c8a42407f34740a8"} err="failed to get container status \"553731f26c8f98734ad741018513eea41509129aa2a05d82c8a42407f34740a8\": rpc error: code = NotFound desc = could not find container \"553731f26c8f98734ad741018513eea41509129aa2a05d82c8a42407f34740a8\": container with ID starting with 
553731f26c8f98734ad741018513eea41509129aa2a05d82c8a42407f34740a8 not found: ID does not exist" Apr 17 17:28:05.071935 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:28:05.071900 2570 scope.go:117] "RemoveContainer" containerID="b430e72105853ae26e3bcda712021275d851c0e90b8ebffb9278eda0a750dad6" Apr 17 17:28:05.072098 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:28:05.072081 2570 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b430e72105853ae26e3bcda712021275d851c0e90b8ebffb9278eda0a750dad6"} err="failed to get container status \"b430e72105853ae26e3bcda712021275d851c0e90b8ebffb9278eda0a750dad6\": rpc error: code = NotFound desc = could not find container \"b430e72105853ae26e3bcda712021275d851c0e90b8ebffb9278eda0a750dad6\": container with ID starting with b430e72105853ae26e3bcda712021275d851c0e90b8ebffb9278eda0a750dad6 not found: ID does not exist" Apr 17 17:28:05.072177 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:28:05.072099 2570 scope.go:117] "RemoveContainer" containerID="4f36d3ab6dfc66656b98d924445f949141d6a197965d433f249a3aac45c6a189" Apr 17 17:28:05.072347 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:28:05.072324 2570 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4f36d3ab6dfc66656b98d924445f949141d6a197965d433f249a3aac45c6a189"} err="failed to get container status \"4f36d3ab6dfc66656b98d924445f949141d6a197965d433f249a3aac45c6a189\": rpc error: code = NotFound desc = could not find container \"4f36d3ab6dfc66656b98d924445f949141d6a197965d433f249a3aac45c6a189\": container with ID starting with 4f36d3ab6dfc66656b98d924445f949141d6a197965d433f249a3aac45c6a189 not found: ID does not exist" Apr 17 17:28:05.072394 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:28:05.072349 2570 scope.go:117] "RemoveContainer" containerID="3e9f75d4a98e2ee125ddc65cd53a0851782d967b9d4b84d389a3d24cbd3a91b9" Apr 17 17:28:05.072578 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:28:05.072559 2570 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3e9f75d4a98e2ee125ddc65cd53a0851782d967b9d4b84d389a3d24cbd3a91b9"} err="failed to get container status \"3e9f75d4a98e2ee125ddc65cd53a0851782d967b9d4b84d389a3d24cbd3a91b9\": rpc error: code = NotFound desc = could not find container \"3e9f75d4a98e2ee125ddc65cd53a0851782d967b9d4b84d389a3d24cbd3a91b9\": container with ID starting with 3e9f75d4a98e2ee125ddc65cd53a0851782d967b9d4b84d389a3d24cbd3a91b9 not found: ID does not exist" Apr 17 17:28:05.078255 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:28:05.078237 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 17 17:28:05.078458 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:28:05.078447 2570 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="c75de9fb-e653-480b-868f-b20405e94293" containerName="config-reloader" Apr 17 17:28:05.078498 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:28:05.078460 2570 state_mem.go:107] "Deleted CPUSet assignment" podUID="c75de9fb-e653-480b-868f-b20405e94293" containerName="config-reloader" Apr 17 17:28:05.078498 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:28:05.078469 2570 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="c75de9fb-e653-480b-868f-b20405e94293" containerName="init-config-reloader" Apr 17 17:28:05.078498 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:28:05.078475 2570 state_mem.go:107] "Deleted CPUSet assignment" podUID="c75de9fb-e653-480b-868f-b20405e94293" containerName="init-config-reloader" Apr 17 17:28:05.078498 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:28:05.078481 2570 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="c75de9fb-e653-480b-868f-b20405e94293" containerName="thanos-sidecar" Apr 17 17:28:05.078498 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:28:05.078486 2570 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="c75de9fb-e653-480b-868f-b20405e94293" containerName="thanos-sidecar" Apr 17 17:28:05.078498 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:28:05.078495 2570 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="c75de9fb-e653-480b-868f-b20405e94293" containerName="prometheus" Apr 17 17:28:05.078498 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:28:05.078500 2570 state_mem.go:107] "Deleted CPUSet assignment" podUID="c75de9fb-e653-480b-868f-b20405e94293" containerName="prometheus" Apr 17 17:28:05.078690 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:28:05.078506 2570 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="c75de9fb-e653-480b-868f-b20405e94293" containerName="kube-rbac-proxy-web" Apr 17 17:28:05.078690 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:28:05.078511 2570 state_mem.go:107] "Deleted CPUSet assignment" podUID="c75de9fb-e653-480b-868f-b20405e94293" containerName="kube-rbac-proxy-web" Apr 17 17:28:05.078690 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:28:05.078518 2570 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="c75de9fb-e653-480b-868f-b20405e94293" containerName="kube-rbac-proxy-thanos" Apr 17 17:28:05.078690 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:28:05.078522 2570 state_mem.go:107] "Deleted CPUSet assignment" podUID="c75de9fb-e653-480b-868f-b20405e94293" containerName="kube-rbac-proxy-thanos" Apr 17 17:28:05.078690 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:28:05.078529 2570 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="c75de9fb-e653-480b-868f-b20405e94293" containerName="kube-rbac-proxy" Apr 17 17:28:05.078690 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:28:05.078534 2570 state_mem.go:107] "Deleted CPUSet assignment" podUID="c75de9fb-e653-480b-868f-b20405e94293" containerName="kube-rbac-proxy" Apr 17 17:28:05.078690 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:28:05.078572 2570 memory_manager.go:356] 
"RemoveStaleState removing state" podUID="c75de9fb-e653-480b-868f-b20405e94293" containerName="kube-rbac-proxy" Apr 17 17:28:05.078690 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:28:05.078581 2570 memory_manager.go:356] "RemoveStaleState removing state" podUID="c75de9fb-e653-480b-868f-b20405e94293" containerName="kube-rbac-proxy-thanos" Apr 17 17:28:05.078690 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:28:05.078586 2570 memory_manager.go:356] "RemoveStaleState removing state" podUID="c75de9fb-e653-480b-868f-b20405e94293" containerName="prometheus" Apr 17 17:28:05.078690 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:28:05.078591 2570 memory_manager.go:356] "RemoveStaleState removing state" podUID="c75de9fb-e653-480b-868f-b20405e94293" containerName="config-reloader" Apr 17 17:28:05.078690 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:28:05.078596 2570 memory_manager.go:356] "RemoveStaleState removing state" podUID="c75de9fb-e653-480b-868f-b20405e94293" containerName="kube-rbac-proxy-web" Apr 17 17:28:05.078690 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:28:05.078603 2570 memory_manager.go:356] "RemoveStaleState removing state" podUID="c75de9fb-e653-480b-868f-b20405e94293" containerName="thanos-sidecar" Apr 17 17:28:05.083426 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:28:05.083412 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Apr 17 17:28:05.085893 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:28:05.085869 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls\"" Apr 17 17:28:05.085893 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:28:05.085885 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-rbac-proxy\"" Apr 17 17:28:05.086024 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:28:05.085917 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-kube-rbac-proxy-web\"" Apr 17 17:28:05.086024 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:28:05.085977 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-sidecar-tls\"" Apr 17 17:28:05.086250 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:28:05.086234 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-grpc-tls-59vkuph86ed6i\"" Apr 17 17:28:05.086330 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:28:05.086263 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls-assets-0\"" Apr 17 17:28:05.086330 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:28:05.086277 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-prometheus-http-client-file\"" Apr 17 17:28:05.086631 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:28:05.086614 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-client-certs\"" Apr 17 17:28:05.086705 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:28:05.086654 2570 reflector.go:430] "Caches populated" type="*v1.Secret" 
reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-web-config\"" Apr 17 17:28:05.086850 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:28:05.086837 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s\"" Apr 17 17:28:05.086925 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:28:05.086906 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-dockercfg-59xwb\"" Apr 17 17:28:05.086981 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:28:05.086965 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"serving-certs-ca-bundle\"" Apr 17 17:28:05.089705 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:28:05.089688 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-rulefiles-0\"" Apr 17 17:28:05.093659 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:28:05.093642 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"prometheus-trusted-ca-bundle\"" Apr 17 17:28:05.098128 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:28:05.098109 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 17 17:28:05.239039 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:28:05.238979 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4901fc58-6f53-4175-ab3b-675370742c29-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"4901fc58-6f53-4175-ab3b-675370742c29\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 17:28:05.239039 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:28:05.239027 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/secret/4901fc58-6f53-4175-ab3b-675370742c29-config\") pod \"prometheus-k8s-0\" (UID: \"4901fc58-6f53-4175-ab3b-675370742c29\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 17:28:05.239211 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:28:05.239054 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/4901fc58-6f53-4175-ab3b-675370742c29-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"4901fc58-6f53-4175-ab3b-675370742c29\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 17:28:05.239211 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:28:05.239087 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/4901fc58-6f53-4175-ab3b-675370742c29-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"4901fc58-6f53-4175-ab3b-675370742c29\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 17:28:05.239211 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:28:05.239116 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/4901fc58-6f53-4175-ab3b-675370742c29-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"4901fc58-6f53-4175-ab3b-675370742c29\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 17:28:05.239211 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:28:05.239138 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/4901fc58-6f53-4175-ab3b-675370742c29-web-config\") pod \"prometheus-k8s-0\" (UID: \"4901fc58-6f53-4175-ab3b-675370742c29\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 17:28:05.239211 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:28:05.239201 
2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/4901fc58-6f53-4175-ab3b-675370742c29-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"4901fc58-6f53-4175-ab3b-675370742c29\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 17:28:05.239469 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:28:05.239224 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fmfsp\" (UniqueName: \"kubernetes.io/projected/4901fc58-6f53-4175-ab3b-675370742c29-kube-api-access-fmfsp\") pod \"prometheus-k8s-0\" (UID: \"4901fc58-6f53-4175-ab3b-675370742c29\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 17:28:05.239469 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:28:05.239290 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4901fc58-6f53-4175-ab3b-675370742c29-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"4901fc58-6f53-4175-ab3b-675370742c29\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 17:28:05.239469 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:28:05.239317 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/4901fc58-6f53-4175-ab3b-675370742c29-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"4901fc58-6f53-4175-ab3b-675370742c29\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 17:28:05.239469 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:28:05.239341 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/4901fc58-6f53-4175-ab3b-675370742c29-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: 
\"4901fc58-6f53-4175-ab3b-675370742c29\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 17:28:05.239469 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:28:05.239383 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/4901fc58-6f53-4175-ab3b-675370742c29-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"4901fc58-6f53-4175-ab3b-675370742c29\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 17:28:05.239469 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:28:05.239415 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/4901fc58-6f53-4175-ab3b-675370742c29-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"4901fc58-6f53-4175-ab3b-675370742c29\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 17:28:05.239469 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:28:05.239448 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/4901fc58-6f53-4175-ab3b-675370742c29-config-out\") pod \"prometheus-k8s-0\" (UID: \"4901fc58-6f53-4175-ab3b-675370742c29\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 17:28:05.239735 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:28:05.239474 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4901fc58-6f53-4175-ab3b-675370742c29-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"4901fc58-6f53-4175-ab3b-675370742c29\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 17:28:05.239735 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:28:05.239505 2570 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/4901fc58-6f53-4175-ab3b-675370742c29-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"4901fc58-6f53-4175-ab3b-675370742c29\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 17:28:05.239735 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:28:05.239529 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/4901fc58-6f53-4175-ab3b-675370742c29-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"4901fc58-6f53-4175-ab3b-675370742c29\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 17:28:05.239735 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:28:05.239557 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/4901fc58-6f53-4175-ab3b-675370742c29-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"4901fc58-6f53-4175-ab3b-675370742c29\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 17:28:05.340005 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:28:05.339972 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/4901fc58-6f53-4175-ab3b-675370742c29-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"4901fc58-6f53-4175-ab3b-675370742c29\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 17:28:05.340005 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:28:05.340012 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/4901fc58-6f53-4175-ab3b-675370742c29-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: 
\"4901fc58-6f53-4175-ab3b-675370742c29\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 17:28:05.340178 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:28:05.340034 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/4901fc58-6f53-4175-ab3b-675370742c29-config-out\") pod \"prometheus-k8s-0\" (UID: \"4901fc58-6f53-4175-ab3b-675370742c29\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 17:28:05.340178 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:28:05.340060 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4901fc58-6f53-4175-ab3b-675370742c29-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"4901fc58-6f53-4175-ab3b-675370742c29\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 17:28:05.340178 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:28:05.340089 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/4901fc58-6f53-4175-ab3b-675370742c29-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"4901fc58-6f53-4175-ab3b-675370742c29\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 17:28:05.340178 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:28:05.340113 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/4901fc58-6f53-4175-ab3b-675370742c29-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"4901fc58-6f53-4175-ab3b-675370742c29\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 17:28:05.340178 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:28:05.340139 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: 
\"kubernetes.io/configmap/4901fc58-6f53-4175-ab3b-675370742c29-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"4901fc58-6f53-4175-ab3b-675370742c29\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 17:28:05.340412 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:28:05.340189 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4901fc58-6f53-4175-ab3b-675370742c29-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"4901fc58-6f53-4175-ab3b-675370742c29\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 17:28:05.340412 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:28:05.340214 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/4901fc58-6f53-4175-ab3b-675370742c29-config\") pod \"prometheus-k8s-0\" (UID: \"4901fc58-6f53-4175-ab3b-675370742c29\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 17:28:05.340412 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:28:05.340236 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/4901fc58-6f53-4175-ab3b-675370742c29-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"4901fc58-6f53-4175-ab3b-675370742c29\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 17:28:05.340412 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:28:05.340286 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/4901fc58-6f53-4175-ab3b-675370742c29-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"4901fc58-6f53-4175-ab3b-675370742c29\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 17:28:05.340412 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:28:05.340329 2570 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/4901fc58-6f53-4175-ab3b-675370742c29-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"4901fc58-6f53-4175-ab3b-675370742c29\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 17:28:05.340412 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:28:05.340354 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/4901fc58-6f53-4175-ab3b-675370742c29-web-config\") pod \"prometheus-k8s-0\" (UID: \"4901fc58-6f53-4175-ab3b-675370742c29\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 17:28:05.340412 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:28:05.340403 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/4901fc58-6f53-4175-ab3b-675370742c29-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"4901fc58-6f53-4175-ab3b-675370742c29\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 17:28:05.340746 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:28:05.340427 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-fmfsp\" (UniqueName: \"kubernetes.io/projected/4901fc58-6f53-4175-ab3b-675370742c29-kube-api-access-fmfsp\") pod \"prometheus-k8s-0\" (UID: \"4901fc58-6f53-4175-ab3b-675370742c29\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 17:28:05.340746 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:28:05.340561 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4901fc58-6f53-4175-ab3b-675370742c29-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"4901fc58-6f53-4175-ab3b-675370742c29\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 17:28:05.340746 ip-10-0-136-202 
kubenswrapper[2570]: I0417 17:28:05.340584 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/4901fc58-6f53-4175-ab3b-675370742c29-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"4901fc58-6f53-4175-ab3b-675370742c29\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 17:28:05.340746 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:28:05.340609 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/4901fc58-6f53-4175-ab3b-675370742c29-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"4901fc58-6f53-4175-ab3b-675370742c29\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 17:28:05.340970 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:28:05.340918 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4901fc58-6f53-4175-ab3b-675370742c29-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"4901fc58-6f53-4175-ab3b-675370742c29\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 17:28:05.340970 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:28:05.340936 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/4901fc58-6f53-4175-ab3b-675370742c29-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"4901fc58-6f53-4175-ab3b-675370742c29\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 17:28:05.341781 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:28:05.341751 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4901fc58-6f53-4175-ab3b-675370742c29-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"4901fc58-6f53-4175-ab3b-675370742c29\") " 
pod="openshift-monitoring/prometheus-k8s-0" Apr 17 17:28:05.342924 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:28:05.342891 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/4901fc58-6f53-4175-ab3b-675370742c29-config-out\") pod \"prometheus-k8s-0\" (UID: \"4901fc58-6f53-4175-ab3b-675370742c29\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 17:28:05.343023 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:28:05.342953 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/4901fc58-6f53-4175-ab3b-675370742c29-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"4901fc58-6f53-4175-ab3b-675370742c29\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 17:28:05.344333 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:28:05.343093 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/4901fc58-6f53-4175-ab3b-675370742c29-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"4901fc58-6f53-4175-ab3b-675370742c29\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 17:28:05.344333 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:28:05.343332 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/4901fc58-6f53-4175-ab3b-675370742c29-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"4901fc58-6f53-4175-ab3b-675370742c29\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 17:28:05.344333 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:28:05.343550 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/4901fc58-6f53-4175-ab3b-675370742c29-config\") pod \"prometheus-k8s-0\" (UID: \"4901fc58-6f53-4175-ab3b-675370742c29\") " 
pod="openshift-monitoring/prometheus-k8s-0" Apr 17 17:28:05.344333 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:28:05.343802 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/4901fc58-6f53-4175-ab3b-675370742c29-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"4901fc58-6f53-4175-ab3b-675370742c29\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 17:28:05.344333 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:28:05.344030 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/4901fc58-6f53-4175-ab3b-675370742c29-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"4901fc58-6f53-4175-ab3b-675370742c29\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 17:28:05.344333 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:28:05.344273 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/4901fc58-6f53-4175-ab3b-675370742c29-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"4901fc58-6f53-4175-ab3b-675370742c29\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 17:28:05.344626 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:28:05.344444 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/4901fc58-6f53-4175-ab3b-675370742c29-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"4901fc58-6f53-4175-ab3b-675370742c29\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 17:28:05.344626 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:28:05.344511 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4901fc58-6f53-4175-ab3b-675370742c29-configmap-serving-certs-ca-bundle\") pod 
\"prometheus-k8s-0\" (UID: \"4901fc58-6f53-4175-ab3b-675370742c29\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 17:28:05.345694 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:28:05.345671 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/4901fc58-6f53-4175-ab3b-675370742c29-web-config\") pod \"prometheus-k8s-0\" (UID: \"4901fc58-6f53-4175-ab3b-675370742c29\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 17:28:05.345783 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:28:05.345765 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/4901fc58-6f53-4175-ab3b-675370742c29-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"4901fc58-6f53-4175-ab3b-675370742c29\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 17:28:05.346575 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:28:05.346553 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/4901fc58-6f53-4175-ab3b-675370742c29-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"4901fc58-6f53-4175-ab3b-675370742c29\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 17:28:05.346875 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:28:05.346857 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/4901fc58-6f53-4175-ab3b-675370742c29-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"4901fc58-6f53-4175-ab3b-675370742c29\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 17:28:05.352024 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:28:05.352005 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-fmfsp\" (UniqueName: \"kubernetes.io/projected/4901fc58-6f53-4175-ab3b-675370742c29-kube-api-access-fmfsp\") pod 
\"prometheus-k8s-0\" (UID: \"4901fc58-6f53-4175-ab3b-675370742c29\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 17:28:05.392499 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:28:05.392480 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Apr 17 17:28:05.517607 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:28:05.517576 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 17 17:28:05.521453 ip-10-0-136-202 kubenswrapper[2570]: W0417 17:28:05.521427 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4901fc58_6f53_4175_ab3b_675370742c29.slice/crio-03ceb6d6a079f6d9b539944f63d719394a82731d0de777ace9653027b5929b8c WatchSource:0}: Error finding container 03ceb6d6a079f6d9b539944f63d719394a82731d0de777ace9653027b5929b8c: Status 404 returned error can't find the container with id 03ceb6d6a079f6d9b539944f63d719394a82731d0de777ace9653027b5929b8c Apr 17 17:28:06.020352 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:28:06.020316 2570 generic.go:358] "Generic (PLEG): container finished" podID="4901fc58-6f53-4175-ab3b-675370742c29" containerID="d712f168a2e3c2b93aaf7d48a64811cc9ba73db8f5e150b6c687e992a71ad929" exitCode=0 Apr 17 17:28:06.020771 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:28:06.020403 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"4901fc58-6f53-4175-ab3b-675370742c29","Type":"ContainerDied","Data":"d712f168a2e3c2b93aaf7d48a64811cc9ba73db8f5e150b6c687e992a71ad929"} Apr 17 17:28:06.020771 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:28:06.020438 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"4901fc58-6f53-4175-ab3b-675370742c29","Type":"ContainerStarted","Data":"03ceb6d6a079f6d9b539944f63d719394a82731d0de777ace9653027b5929b8c"} Apr 
17 17:28:06.258947 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:28:06.258915 2570 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c75de9fb-e653-480b-868f-b20405e94293" path="/var/lib/kubelet/pods/c75de9fb-e653-480b-868f-b20405e94293/volumes" Apr 17 17:28:07.028771 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:28:07.028730 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"4901fc58-6f53-4175-ab3b-675370742c29","Type":"ContainerStarted","Data":"9351f2d66db6ce43d7b4c617444d3f74dd860a915de6f337cd2b07647c874bbd"} Apr 17 17:28:07.028771 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:28:07.028772 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"4901fc58-6f53-4175-ab3b-675370742c29","Type":"ContainerStarted","Data":"9812327f46f8603dbc402e27695e98493328440d88f35ae9ed5b76a25b2f9502"} Apr 17 17:28:07.029209 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:28:07.028782 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"4901fc58-6f53-4175-ab3b-675370742c29","Type":"ContainerStarted","Data":"5ceb7ee6ef24f1d7434991cd95ebee441e1eab2017f9f5834ff659fd078e2a0d"} Apr 17 17:28:07.029209 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:28:07.028792 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"4901fc58-6f53-4175-ab3b-675370742c29","Type":"ContainerStarted","Data":"d10511ce5f0cb468430f078e9ecf2523830b2aea8fff71ffa70e9a4459906045"} Apr 17 17:28:07.029209 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:28:07.028800 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"4901fc58-6f53-4175-ab3b-675370742c29","Type":"ContainerStarted","Data":"d2acfcad75c497d2f18ba3a5b11d1f25074654186ff96460864bc3af106e10a8"} Apr 17 17:28:07.029209 ip-10-0-136-202 kubenswrapper[2570]: I0417 
17:28:07.028809 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"4901fc58-6f53-4175-ab3b-675370742c29","Type":"ContainerStarted","Data":"8a28aca3c71e3808f1d43aa098fe1658159e22714a5e86e5640d2ac23a9e9c44"} Apr 17 17:28:07.060224 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:28:07.060172 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-k8s-0" podStartSLOduration=2.060133263 podStartE2EDuration="2.060133263s" podCreationTimestamp="2026-04-17 17:28:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 17:28:07.058464562 +0000 UTC m=+247.437808376" watchObservedRunningTime="2026-04-17 17:28:07.060133263 +0000 UTC m=+247.439477077" Apr 17 17:28:10.392908 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:28:10.392873 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/prometheus-k8s-0" Apr 17 17:28:11.083474 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:28:11.083444 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/82dffe03-2b0c-4ac4-bb02-5a8430704805-metrics-certs\") pod \"network-metrics-daemon-zcn2s\" (UID: \"82dffe03-2b0c-4ac4-bb02-5a8430704805\") " pod="openshift-multus/network-metrics-daemon-zcn2s" Apr 17 17:28:11.085580 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:28:11.085551 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/82dffe03-2b0c-4ac4-bb02-5a8430704805-metrics-certs\") pod \"network-metrics-daemon-zcn2s\" (UID: \"82dffe03-2b0c-4ac4-bb02-5a8430704805\") " pod="openshift-multus/network-metrics-daemon-zcn2s" Apr 17 17:28:11.352874 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:28:11.352780 2570 reflector.go:430] "Caches populated" 
type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-7c8rp\"" Apr 17 17:28:11.360732 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:28:11.360717 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-zcn2s" Apr 17 17:28:11.473515 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:28:11.473492 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-zcn2s"] Apr 17 17:28:11.476293 ip-10-0-136-202 kubenswrapper[2570]: W0417 17:28:11.476271 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod82dffe03_2b0c_4ac4_bb02_5a8430704805.slice/crio-8b1aa2abfa68183a60d24ea31c16e70c23bf5054e1e6dad3990e2b66bcba3d42 WatchSource:0}: Error finding container 8b1aa2abfa68183a60d24ea31c16e70c23bf5054e1e6dad3990e2b66bcba3d42: Status 404 returned error can't find the container with id 8b1aa2abfa68183a60d24ea31c16e70c23bf5054e1e6dad3990e2b66bcba3d42 Apr 17 17:28:12.044276 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:28:12.044241 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-zcn2s" event={"ID":"82dffe03-2b0c-4ac4-bb02-5a8430704805","Type":"ContainerStarted","Data":"8b1aa2abfa68183a60d24ea31c16e70c23bf5054e1e6dad3990e2b66bcba3d42"} Apr 17 17:28:13.048306 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:28:13.048270 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-zcn2s" event={"ID":"82dffe03-2b0c-4ac4-bb02-5a8430704805","Type":"ContainerStarted","Data":"c4447c664eca7f91937a619dbc372f36b58624bbc75e168fa342f2c10ce1c89d"} Apr 17 17:28:13.048306 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:28:13.048308 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-zcn2s" 
event={"ID":"82dffe03-2b0c-4ac4-bb02-5a8430704805","Type":"ContainerStarted","Data":"aa094cd249c0f36d27cf6a82e4190dbbd0a58b848cf6f8bf184ee0de9ccc3282"} Apr 17 17:28:13.065858 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:28:13.065806 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-zcn2s" podStartSLOduration=252.120387451 podStartE2EDuration="4m13.065792284s" podCreationTimestamp="2026-04-17 17:24:00 +0000 UTC" firstStartedPulling="2026-04-17 17:28:11.478533327 +0000 UTC m=+251.857877134" lastFinishedPulling="2026-04-17 17:28:12.423938164 +0000 UTC m=+252.803281967" observedRunningTime="2026-04-17 17:28:13.064570609 +0000 UTC m=+253.443914436" watchObservedRunningTime="2026-04-17 17:28:13.065792284 +0000 UTC m=+253.445136163" Apr 17 17:28:43.646703 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:28:43.646658 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/global-pull-secret-syncer-7xvgj"] Apr 17 17:28:43.649795 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:28:43.649779 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-7xvgj" Apr 17 17:28:43.652704 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:28:43.652680 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"original-pull-secret\"" Apr 17 17:28:43.656849 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:28:43.656490 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-7xvgj"] Apr 17 17:28:43.708031 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:28:43.707994 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/c39514c7-a4e3-4184-a579-f8a32cca8b54-original-pull-secret\") pod \"global-pull-secret-syncer-7xvgj\" (UID: \"c39514c7-a4e3-4184-a579-f8a32cca8b54\") " pod="kube-system/global-pull-secret-syncer-7xvgj" Apr 17 17:28:43.708223 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:28:43.708061 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/c39514c7-a4e3-4184-a579-f8a32cca8b54-dbus\") pod \"global-pull-secret-syncer-7xvgj\" (UID: \"c39514c7-a4e3-4184-a579-f8a32cca8b54\") " pod="kube-system/global-pull-secret-syncer-7xvgj" Apr 17 17:28:43.708223 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:28:43.708079 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/c39514c7-a4e3-4184-a579-f8a32cca8b54-kubelet-config\") pod \"global-pull-secret-syncer-7xvgj\" (UID: \"c39514c7-a4e3-4184-a579-f8a32cca8b54\") " pod="kube-system/global-pull-secret-syncer-7xvgj" Apr 17 17:28:43.808474 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:28:43.808441 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dbus\" (UniqueName: 
\"kubernetes.io/host-path/c39514c7-a4e3-4184-a579-f8a32cca8b54-dbus\") pod \"global-pull-secret-syncer-7xvgj\" (UID: \"c39514c7-a4e3-4184-a579-f8a32cca8b54\") " pod="kube-system/global-pull-secret-syncer-7xvgj" Apr 17 17:28:43.808474 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:28:43.808476 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/c39514c7-a4e3-4184-a579-f8a32cca8b54-kubelet-config\") pod \"global-pull-secret-syncer-7xvgj\" (UID: \"c39514c7-a4e3-4184-a579-f8a32cca8b54\") " pod="kube-system/global-pull-secret-syncer-7xvgj" Apr 17 17:28:43.808635 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:28:43.808520 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/c39514c7-a4e3-4184-a579-f8a32cca8b54-original-pull-secret\") pod \"global-pull-secret-syncer-7xvgj\" (UID: \"c39514c7-a4e3-4184-a579-f8a32cca8b54\") " pod="kube-system/global-pull-secret-syncer-7xvgj" Apr 17 17:28:43.808671 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:28:43.808633 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/c39514c7-a4e3-4184-a579-f8a32cca8b54-dbus\") pod \"global-pull-secret-syncer-7xvgj\" (UID: \"c39514c7-a4e3-4184-a579-f8a32cca8b54\") " pod="kube-system/global-pull-secret-syncer-7xvgj" Apr 17 17:28:43.808671 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:28:43.808641 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/c39514c7-a4e3-4184-a579-f8a32cca8b54-kubelet-config\") pod \"global-pull-secret-syncer-7xvgj\" (UID: \"c39514c7-a4e3-4184-a579-f8a32cca8b54\") " pod="kube-system/global-pull-secret-syncer-7xvgj" Apr 17 17:28:43.810725 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:28:43.810708 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for 
volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/c39514c7-a4e3-4184-a579-f8a32cca8b54-original-pull-secret\") pod \"global-pull-secret-syncer-7xvgj\" (UID: \"c39514c7-a4e3-4184-a579-f8a32cca8b54\") " pod="kube-system/global-pull-secret-syncer-7xvgj" Apr 17 17:28:43.959979 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:28:43.959916 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-7xvgj" Apr 17 17:28:44.072210 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:28:44.072183 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-7xvgj"] Apr 17 17:28:44.076599 ip-10-0-136-202 kubenswrapper[2570]: W0417 17:28:44.076572 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc39514c7_a4e3_4184_a579_f8a32cca8b54.slice/crio-6079602b287b671b39629cbb5369c7b8d89a44d9376a9da50db9d2536effeb59 WatchSource:0}: Error finding container 6079602b287b671b39629cbb5369c7b8d89a44d9376a9da50db9d2536effeb59: Status 404 returned error can't find the container with id 6079602b287b671b39629cbb5369c7b8d89a44d9376a9da50db9d2536effeb59 Apr 17 17:28:44.134638 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:28:44.134609 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-7xvgj" event={"ID":"c39514c7-a4e3-4184-a579-f8a32cca8b54","Type":"ContainerStarted","Data":"6079602b287b671b39629cbb5369c7b8d89a44d9376a9da50db9d2536effeb59"} Apr 17 17:28:48.148221 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:28:48.148182 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-7xvgj" event={"ID":"c39514c7-a4e3-4184-a579-f8a32cca8b54","Type":"ContainerStarted","Data":"415aa686a3de786d62150d4d772a101613e36c273c60fc582a6ce762c95c237f"} Apr 17 17:28:48.162814 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:28:48.162768 2570 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/global-pull-secret-syncer-7xvgj" podStartSLOduration=1.5002728269999999 podStartE2EDuration="5.162755042s" podCreationTimestamp="2026-04-17 17:28:43 +0000 UTC" firstStartedPulling="2026-04-17 17:28:44.078397709 +0000 UTC m=+284.457741501" lastFinishedPulling="2026-04-17 17:28:47.74087991 +0000 UTC m=+288.120223716" observedRunningTime="2026-04-17 17:28:48.161860107 +0000 UTC m=+288.541203923" watchObservedRunningTime="2026-04-17 17:28:48.162755042 +0000 UTC m=+288.542098855" Apr 17 17:29:00.148598 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:29:00.148566 2570 kubelet.go:1628] "Image garbage collection succeeded" Apr 17 17:29:05.392700 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:29:05.392655 2570 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-monitoring/prometheus-k8s-0" Apr 17 17:29:05.408735 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:29:05.408709 2570 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-monitoring/prometheus-k8s-0" Apr 17 17:29:06.212653 ip-10-0-136-202 kubenswrapper[2570]: I0417 17:29:06.212628 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/prometheus-k8s-0" Apr 17 18:12:12.004439 ip-10-0-136-202 kubenswrapper[2570]: I0417 18:12:12.004403 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-xhv8p/must-gather-gvbd6"] Apr 17 18:12:12.007885 ip-10-0-136-202 kubenswrapper[2570]: I0417 18:12:12.007867 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-xhv8p/must-gather-gvbd6" Apr 17 18:12:12.010072 ip-10-0-136-202 kubenswrapper[2570]: I0417 18:12:12.010050 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-xhv8p\"/\"kube-root-ca.crt\"" Apr 17 18:12:12.010911 ip-10-0-136-202 kubenswrapper[2570]: I0417 18:12:12.010893 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-xhv8p\"/\"default-dockercfg-hk9hw\"" Apr 17 18:12:12.010980 ip-10-0-136-202 kubenswrapper[2570]: I0417 18:12:12.010908 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-xhv8p\"/\"openshift-service-ca.crt\"" Apr 17 18:12:12.026622 ip-10-0-136-202 kubenswrapper[2570]: I0417 18:12:12.026595 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-xhv8p/must-gather-gvbd6"] Apr 17 18:12:12.108349 ip-10-0-136-202 kubenswrapper[2570]: I0417 18:12:12.108316 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/849e3010-c633-41eb-8a2f-a7ee563a584d-must-gather-output\") pod \"must-gather-gvbd6\" (UID: \"849e3010-c633-41eb-8a2f-a7ee563a584d\") " pod="openshift-must-gather-xhv8p/must-gather-gvbd6" Apr 17 18:12:12.108518 ip-10-0-136-202 kubenswrapper[2570]: I0417 18:12:12.108381 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tsvlp\" (UniqueName: \"kubernetes.io/projected/849e3010-c633-41eb-8a2f-a7ee563a584d-kube-api-access-tsvlp\") pod \"must-gather-gvbd6\" (UID: \"849e3010-c633-41eb-8a2f-a7ee563a584d\") " pod="openshift-must-gather-xhv8p/must-gather-gvbd6" Apr 17 18:12:12.209597 ip-10-0-136-202 kubenswrapper[2570]: I0417 18:12:12.209565 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: 
\"kubernetes.io/empty-dir/849e3010-c633-41eb-8a2f-a7ee563a584d-must-gather-output\") pod \"must-gather-gvbd6\" (UID: \"849e3010-c633-41eb-8a2f-a7ee563a584d\") " pod="openshift-must-gather-xhv8p/must-gather-gvbd6" Apr 17 18:12:12.209717 ip-10-0-136-202 kubenswrapper[2570]: I0417 18:12:12.209627 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-tsvlp\" (UniqueName: \"kubernetes.io/projected/849e3010-c633-41eb-8a2f-a7ee563a584d-kube-api-access-tsvlp\") pod \"must-gather-gvbd6\" (UID: \"849e3010-c633-41eb-8a2f-a7ee563a584d\") " pod="openshift-must-gather-xhv8p/must-gather-gvbd6" Apr 17 18:12:12.209935 ip-10-0-136-202 kubenswrapper[2570]: I0417 18:12:12.209913 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/849e3010-c633-41eb-8a2f-a7ee563a584d-must-gather-output\") pod \"must-gather-gvbd6\" (UID: \"849e3010-c633-41eb-8a2f-a7ee563a584d\") " pod="openshift-must-gather-xhv8p/must-gather-gvbd6" Apr 17 18:12:12.216875 ip-10-0-136-202 kubenswrapper[2570]: I0417 18:12:12.216852 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-tsvlp\" (UniqueName: \"kubernetes.io/projected/849e3010-c633-41eb-8a2f-a7ee563a584d-kube-api-access-tsvlp\") pod \"must-gather-gvbd6\" (UID: \"849e3010-c633-41eb-8a2f-a7ee563a584d\") " pod="openshift-must-gather-xhv8p/must-gather-gvbd6" Apr 17 18:12:12.316392 ip-10-0-136-202 kubenswrapper[2570]: I0417 18:12:12.316340 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-xhv8p/must-gather-gvbd6" Apr 17 18:12:12.428519 ip-10-0-136-202 kubenswrapper[2570]: I0417 18:12:12.428490 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-xhv8p/must-gather-gvbd6"] Apr 17 18:12:12.431509 ip-10-0-136-202 kubenswrapper[2570]: W0417 18:12:12.431476 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod849e3010_c633_41eb_8a2f_a7ee563a584d.slice/crio-9829c86e27390647e9ae69277787def7f68a4c8fbc8cf9ccb3d6a45a381e4055 WatchSource:0}: Error finding container 9829c86e27390647e9ae69277787def7f68a4c8fbc8cf9ccb3d6a45a381e4055: Status 404 returned error can't find the container with id 9829c86e27390647e9ae69277787def7f68a4c8fbc8cf9ccb3d6a45a381e4055 Apr 17 18:12:12.433134 ip-10-0-136-202 kubenswrapper[2570]: I0417 18:12:12.433116 2570 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 17 18:12:13.086985 ip-10-0-136-202 kubenswrapper[2570]: I0417 18:12:13.086953 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-xhv8p/must-gather-gvbd6" event={"ID":"849e3010-c633-41eb-8a2f-a7ee563a584d","Type":"ContainerStarted","Data":"9829c86e27390647e9ae69277787def7f68a4c8fbc8cf9ccb3d6a45a381e4055"} Apr 17 18:12:18.103518 ip-10-0-136-202 kubenswrapper[2570]: I0417 18:12:18.103480 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-xhv8p/must-gather-gvbd6" event={"ID":"849e3010-c633-41eb-8a2f-a7ee563a584d","Type":"ContainerStarted","Data":"d55c2e9b7243b16493b377d2a8835cd04ec3afd45896a3f6908fa39ccbcfac43"} Apr 17 18:12:19.108084 ip-10-0-136-202 kubenswrapper[2570]: I0417 18:12:19.108046 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-xhv8p/must-gather-gvbd6" 
event={"ID":"849e3010-c633-41eb-8a2f-a7ee563a584d","Type":"ContainerStarted","Data":"d5831a18bd43e514bb9ee6d4d9f83001b7dbe72b32315925dc892ece4df90889"}
Apr 17 18:12:19.122542 ip-10-0-136-202 kubenswrapper[2570]: I0417 18:12:19.122492 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-xhv8p/must-gather-gvbd6" podStartSLOduration=2.609184441 podStartE2EDuration="8.122479339s" podCreationTimestamp="2026-04-17 18:12:11 +0000 UTC" firstStartedPulling="2026-04-17 18:12:12.433308049 +0000 UTC m=+2892.812651844" lastFinishedPulling="2026-04-17 18:12:17.946602946 +0000 UTC m=+2898.325946742" observedRunningTime="2026-04-17 18:12:19.121946587 +0000 UTC m=+2899.501290401" watchObservedRunningTime="2026-04-17 18:12:19.122479339 +0000 UTC m=+2899.501823152"
Apr 17 18:12:36.158285 ip-10-0-136-202 kubenswrapper[2570]: I0417 18:12:36.158254 2570 generic.go:358] "Generic (PLEG): container finished" podID="849e3010-c633-41eb-8a2f-a7ee563a584d" containerID="d55c2e9b7243b16493b377d2a8835cd04ec3afd45896a3f6908fa39ccbcfac43" exitCode=0
Apr 17 18:12:36.158681 ip-10-0-136-202 kubenswrapper[2570]: I0417 18:12:36.158311 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-xhv8p/must-gather-gvbd6" event={"ID":"849e3010-c633-41eb-8a2f-a7ee563a584d","Type":"ContainerDied","Data":"d55c2e9b7243b16493b377d2a8835cd04ec3afd45896a3f6908fa39ccbcfac43"}
Apr 17 18:12:36.158681 ip-10-0-136-202 kubenswrapper[2570]: I0417 18:12:36.158605 2570 scope.go:117] "RemoveContainer" containerID="d55c2e9b7243b16493b377d2a8835cd04ec3afd45896a3f6908fa39ccbcfac43"
Apr 17 18:12:36.690238 ip-10-0-136-202 kubenswrapper[2570]: I0417 18:12:36.690200 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-xhv8p_must-gather-gvbd6_849e3010-c633-41eb-8a2f-a7ee563a584d/gather/0.log"
Apr 17 18:12:39.886797 ip-10-0-136-202 kubenswrapper[2570]: I0417 18:12:39.886769 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_global-pull-secret-syncer-7xvgj_c39514c7-a4e3-4184-a579-f8a32cca8b54/global-pull-secret-syncer/0.log"
Apr 17 18:12:40.030567 ip-10-0-136-202 kubenswrapper[2570]: I0417 18:12:40.030536 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_konnectivity-agent-n6fpq_eb68dc81-dce9-4f3e-bf24-94eff6bc04bf/konnectivity-agent/0.log"
Apr 17 18:12:40.149909 ip-10-0-136-202 kubenswrapper[2570]: I0417 18:12:40.149844 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_kube-apiserver-proxy-ip-10-0-136-202.ec2.internal_697494af72639fc0eb382c71525a5808/haproxy/0.log"
Apr 17 18:12:42.109523 ip-10-0-136-202 kubenswrapper[2570]: I0417 18:12:42.109477 2570 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-xhv8p/must-gather-gvbd6"]
Apr 17 18:12:42.110046 ip-10-0-136-202 kubenswrapper[2570]: I0417 18:12:42.109757 2570 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-must-gather-xhv8p/must-gather-gvbd6" podUID="849e3010-c633-41eb-8a2f-a7ee563a584d" containerName="copy" containerID="cri-o://d5831a18bd43e514bb9ee6d4d9f83001b7dbe72b32315925dc892ece4df90889" gracePeriod=2
Apr 17 18:12:42.111064 ip-10-0-136-202 kubenswrapper[2570]: I0417 18:12:42.111037 2570 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-xhv8p/must-gather-gvbd6"]
Apr 17 18:12:42.111958 ip-10-0-136-202 kubenswrapper[2570]: I0417 18:12:42.111925 2570 status_manager.go:895] "Failed to get status for pod" podUID="849e3010-c633-41eb-8a2f-a7ee563a584d" pod="openshift-must-gather-xhv8p/must-gather-gvbd6" err="pods \"must-gather-gvbd6\" is forbidden: User \"system:node:ip-10-0-136-202.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"openshift-must-gather-xhv8p\": no relationship found between node 'ip-10-0-136-202.ec2.internal' and this object"
Apr 17 18:12:42.329068 ip-10-0-136-202 kubenswrapper[2570]: I0417 18:12:42.329050 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-xhv8p_must-gather-gvbd6_849e3010-c633-41eb-8a2f-a7ee563a584d/copy/0.log"
Apr 17 18:12:42.329385 ip-10-0-136-202 kubenswrapper[2570]: I0417 18:12:42.329370 2570 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-xhv8p/must-gather-gvbd6"
Apr 17 18:12:42.440451 ip-10-0-136-202 kubenswrapper[2570]: I0417 18:12:42.440395 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tsvlp\" (UniqueName: \"kubernetes.io/projected/849e3010-c633-41eb-8a2f-a7ee563a584d-kube-api-access-tsvlp\") pod \"849e3010-c633-41eb-8a2f-a7ee563a584d\" (UID: \"849e3010-c633-41eb-8a2f-a7ee563a584d\") "
Apr 17 18:12:42.440451 ip-10-0-136-202 kubenswrapper[2570]: I0417 18:12:42.440446 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/849e3010-c633-41eb-8a2f-a7ee563a584d-must-gather-output\") pod \"849e3010-c633-41eb-8a2f-a7ee563a584d\" (UID: \"849e3010-c633-41eb-8a2f-a7ee563a584d\") "
Apr 17 18:12:42.441945 ip-10-0-136-202 kubenswrapper[2570]: I0417 18:12:42.441922 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/849e3010-c633-41eb-8a2f-a7ee563a584d-must-gather-output" (OuterVolumeSpecName: "must-gather-output") pod "849e3010-c633-41eb-8a2f-a7ee563a584d" (UID: "849e3010-c633-41eb-8a2f-a7ee563a584d"). InnerVolumeSpecName "must-gather-output". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 17 18:12:42.442483 ip-10-0-136-202 kubenswrapper[2570]: I0417 18:12:42.442457 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/849e3010-c633-41eb-8a2f-a7ee563a584d-kube-api-access-tsvlp" (OuterVolumeSpecName: "kube-api-access-tsvlp") pod "849e3010-c633-41eb-8a2f-a7ee563a584d" (UID: "849e3010-c633-41eb-8a2f-a7ee563a584d"). InnerVolumeSpecName "kube-api-access-tsvlp". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 17 18:12:42.541078 ip-10-0-136-202 kubenswrapper[2570]: I0417 18:12:42.541048 2570 reconciler_common.go:299] "Volume detached for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/849e3010-c633-41eb-8a2f-a7ee563a584d-must-gather-output\") on node \"ip-10-0-136-202.ec2.internal\" DevicePath \"\""
Apr 17 18:12:42.541078 ip-10-0-136-202 kubenswrapper[2570]: I0417 18:12:42.541076 2570 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-tsvlp\" (UniqueName: \"kubernetes.io/projected/849e3010-c633-41eb-8a2f-a7ee563a584d-kube-api-access-tsvlp\") on node \"ip-10-0-136-202.ec2.internal\" DevicePath \"\""
Apr 17 18:12:43.178436 ip-10-0-136-202 kubenswrapper[2570]: I0417 18:12:43.178407 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-xhv8p_must-gather-gvbd6_849e3010-c633-41eb-8a2f-a7ee563a584d/copy/0.log"
Apr 17 18:12:43.178873 ip-10-0-136-202 kubenswrapper[2570]: I0417 18:12:43.178815 2570 generic.go:358] "Generic (PLEG): container finished" podID="849e3010-c633-41eb-8a2f-a7ee563a584d" containerID="d5831a18bd43e514bb9ee6d4d9f83001b7dbe72b32315925dc892ece4df90889" exitCode=143
Apr 17 18:12:43.178933 ip-10-0-136-202 kubenswrapper[2570]: I0417 18:12:43.178877 2570 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-xhv8p/must-gather-gvbd6"
Apr 17 18:12:43.178933 ip-10-0-136-202 kubenswrapper[2570]: I0417 18:12:43.178919 2570 scope.go:117] "RemoveContainer" containerID="d5831a18bd43e514bb9ee6d4d9f83001b7dbe72b32315925dc892ece4df90889"
Apr 17 18:12:43.188326 ip-10-0-136-202 kubenswrapper[2570]: I0417 18:12:43.188227 2570 scope.go:117] "RemoveContainer" containerID="d55c2e9b7243b16493b377d2a8835cd04ec3afd45896a3f6908fa39ccbcfac43"
Apr 17 18:12:43.200325 ip-10-0-136-202 kubenswrapper[2570]: I0417 18:12:43.200301 2570 scope.go:117] "RemoveContainer" containerID="d5831a18bd43e514bb9ee6d4d9f83001b7dbe72b32315925dc892ece4df90889"
Apr 17 18:12:43.200597 ip-10-0-136-202 kubenswrapper[2570]: E0417 18:12:43.200576 2570 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d5831a18bd43e514bb9ee6d4d9f83001b7dbe72b32315925dc892ece4df90889\": container with ID starting with d5831a18bd43e514bb9ee6d4d9f83001b7dbe72b32315925dc892ece4df90889 not found: ID does not exist" containerID="d5831a18bd43e514bb9ee6d4d9f83001b7dbe72b32315925dc892ece4df90889"
Apr 17 18:12:43.200643 ip-10-0-136-202 kubenswrapper[2570]: I0417 18:12:43.200609 2570 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d5831a18bd43e514bb9ee6d4d9f83001b7dbe72b32315925dc892ece4df90889"} err="failed to get container status \"d5831a18bd43e514bb9ee6d4d9f83001b7dbe72b32315925dc892ece4df90889\": rpc error: code = NotFound desc = could not find container \"d5831a18bd43e514bb9ee6d4d9f83001b7dbe72b32315925dc892ece4df90889\": container with ID starting with d5831a18bd43e514bb9ee6d4d9f83001b7dbe72b32315925dc892ece4df90889 not found: ID does not exist"
Apr 17 18:12:43.200643 ip-10-0-136-202 kubenswrapper[2570]: I0417 18:12:43.200637 2570 scope.go:117] "RemoveContainer" containerID="d55c2e9b7243b16493b377d2a8835cd04ec3afd45896a3f6908fa39ccbcfac43"
Apr 17 18:12:43.200903 ip-10-0-136-202 kubenswrapper[2570]: E0417 18:12:43.200883 2570 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d55c2e9b7243b16493b377d2a8835cd04ec3afd45896a3f6908fa39ccbcfac43\": container with ID starting with d55c2e9b7243b16493b377d2a8835cd04ec3afd45896a3f6908fa39ccbcfac43 not found: ID does not exist" containerID="d55c2e9b7243b16493b377d2a8835cd04ec3afd45896a3f6908fa39ccbcfac43"
Apr 17 18:12:43.200969 ip-10-0-136-202 kubenswrapper[2570]: I0417 18:12:43.200926 2570 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d55c2e9b7243b16493b377d2a8835cd04ec3afd45896a3f6908fa39ccbcfac43"} err="failed to get container status \"d55c2e9b7243b16493b377d2a8835cd04ec3afd45896a3f6908fa39ccbcfac43\": rpc error: code = NotFound desc = could not find container \"d55c2e9b7243b16493b377d2a8835cd04ec3afd45896a3f6908fa39ccbcfac43\": container with ID starting with d55c2e9b7243b16493b377d2a8835cd04ec3afd45896a3f6908fa39ccbcfac43 not found: ID does not exist"
Apr 17 18:12:43.782742 ip-10-0-136-202 kubenswrapper[2570]: I0417 18:12:43.782715 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_metrics-server-6f6c8f44fb-k2cn9_8ffbee18-703d-4c32-ac71-d0a0076a1b98/metrics-server/0.log"
Apr 17 18:12:43.982159 ip-10-0-136-202 kubenswrapper[2570]: I0417 18:12:43.982125 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-xtltb_783c23c8-2363-4b4e-bc25-560321d31f0d/node-exporter/0.log"
Apr 17 18:12:43.998949 ip-10-0-136-202 kubenswrapper[2570]: I0417 18:12:43.998927 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-xtltb_783c23c8-2363-4b4e-bc25-560321d31f0d/kube-rbac-proxy/0.log"
Apr 17 18:12:44.016739 ip-10-0-136-202 kubenswrapper[2570]: I0417 18:12:44.016722 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-xtltb_783c23c8-2363-4b4e-bc25-560321d31f0d/init-textfile/0.log"
Apr 17 18:12:44.122972 ip-10-0-136-202 kubenswrapper[2570]: I0417 18:12:44.122922 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_4901fc58-6f53-4175-ab3b-675370742c29/prometheus/0.log"
Apr 17 18:12:44.136391 ip-10-0-136-202 kubenswrapper[2570]: I0417 18:12:44.136373 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_4901fc58-6f53-4175-ab3b-675370742c29/config-reloader/0.log"
Apr 17 18:12:44.161082 ip-10-0-136-202 kubenswrapper[2570]: I0417 18:12:44.161063 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_4901fc58-6f53-4175-ab3b-675370742c29/thanos-sidecar/0.log"
Apr 17 18:12:44.178304 ip-10-0-136-202 kubenswrapper[2570]: I0417 18:12:44.178280 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_4901fc58-6f53-4175-ab3b-675370742c29/kube-rbac-proxy-web/0.log"
Apr 17 18:12:44.198727 ip-10-0-136-202 kubenswrapper[2570]: I0417 18:12:44.198709 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_4901fc58-6f53-4175-ab3b-675370742c29/kube-rbac-proxy/0.log"
Apr 17 18:12:44.219689 ip-10-0-136-202 kubenswrapper[2570]: I0417 18:12:44.219669 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_4901fc58-6f53-4175-ab3b-675370742c29/kube-rbac-proxy-thanos/0.log"
Apr 17 18:12:44.237097 ip-10-0-136-202 kubenswrapper[2570]: I0417 18:12:44.237081 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_4901fc58-6f53-4175-ab3b-675370742c29/init-config-reloader/0.log"
Apr 17 18:12:44.252957 ip-10-0-136-202 kubenswrapper[2570]: I0417 18:12:44.252933 2570 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="849e3010-c633-41eb-8a2f-a7ee563a584d" path="/var/lib/kubelet/pods/849e3010-c633-41eb-8a2f-a7ee563a584d/volumes"
Apr 17 18:12:47.181204 ip-10-0-136-202 kubenswrapper[2570]: I0417 18:12:47.181173 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-kkzt9/perf-node-gather-daemonset-6mck7"]
Apr 17 18:12:47.181619 ip-10-0-136-202 kubenswrapper[2570]: I0417 18:12:47.181423 2570 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="849e3010-c633-41eb-8a2f-a7ee563a584d" containerName="copy"
Apr 17 18:12:47.181619 ip-10-0-136-202 kubenswrapper[2570]: I0417 18:12:47.181435 2570 state_mem.go:107] "Deleted CPUSet assignment" podUID="849e3010-c633-41eb-8a2f-a7ee563a584d" containerName="copy"
Apr 17 18:12:47.181619 ip-10-0-136-202 kubenswrapper[2570]: I0417 18:12:47.181444 2570 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="849e3010-c633-41eb-8a2f-a7ee563a584d" containerName="gather"
Apr 17 18:12:47.181619 ip-10-0-136-202 kubenswrapper[2570]: I0417 18:12:47.181450 2570 state_mem.go:107] "Deleted CPUSet assignment" podUID="849e3010-c633-41eb-8a2f-a7ee563a584d" containerName="gather"
Apr 17 18:12:47.181619 ip-10-0-136-202 kubenswrapper[2570]: I0417 18:12:47.181486 2570 memory_manager.go:356] "RemoveStaleState removing state" podUID="849e3010-c633-41eb-8a2f-a7ee563a584d" containerName="gather"
Apr 17 18:12:47.181619 ip-10-0-136-202 kubenswrapper[2570]: I0417 18:12:47.181495 2570 memory_manager.go:356] "RemoveStaleState removing state" podUID="849e3010-c633-41eb-8a2f-a7ee563a584d" containerName="copy"
Apr 17 18:12:47.184401 ip-10-0-136-202 kubenswrapper[2570]: I0417 18:12:47.184385 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-kkzt9/perf-node-gather-daemonset-6mck7"
Apr 17 18:12:47.186999 ip-10-0-136-202 kubenswrapper[2570]: I0417 18:12:47.186974 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-kkzt9\"/\"kube-root-ca.crt\""
Apr 17 18:12:47.187120 ip-10-0-136-202 kubenswrapper[2570]: I0417 18:12:47.187005 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-kkzt9\"/\"openshift-service-ca.crt\""
Apr 17 18:12:47.188213 ip-10-0-136-202 kubenswrapper[2570]: I0417 18:12:47.188197 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-kkzt9\"/\"default-dockercfg-jtmx4\""
Apr 17 18:12:47.194246 ip-10-0-136-202 kubenswrapper[2570]: I0417 18:12:47.194222 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-kkzt9/perf-node-gather-daemonset-6mck7"]
Apr 17 18:12:47.273654 ip-10-0-136-202 kubenswrapper[2570]: I0417 18:12:47.273623 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/2283a251-6e74-4c1f-9183-000957ab8547-proc\") pod \"perf-node-gather-daemonset-6mck7\" (UID: \"2283a251-6e74-4c1f-9183-000957ab8547\") " pod="openshift-must-gather-kkzt9/perf-node-gather-daemonset-6mck7"
Apr 17 18:12:47.273654 ip-10-0-136-202 kubenswrapper[2570]: I0417 18:12:47.273661 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/2283a251-6e74-4c1f-9183-000957ab8547-sys\") pod \"perf-node-gather-daemonset-6mck7\" (UID: \"2283a251-6e74-4c1f-9183-000957ab8547\") " pod="openshift-must-gather-kkzt9/perf-node-gather-daemonset-6mck7"
Apr 17 18:12:47.273906 ip-10-0-136-202 kubenswrapper[2570]: I0417 18:12:47.273682 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/2283a251-6e74-4c1f-9183-000957ab8547-lib-modules\") pod \"perf-node-gather-daemonset-6mck7\" (UID: \"2283a251-6e74-4c1f-9183-000957ab8547\") " pod="openshift-must-gather-kkzt9/perf-node-gather-daemonset-6mck7"
Apr 17 18:12:47.273906 ip-10-0-136-202 kubenswrapper[2570]: I0417 18:12:47.273802 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/2283a251-6e74-4c1f-9183-000957ab8547-podres\") pod \"perf-node-gather-daemonset-6mck7\" (UID: \"2283a251-6e74-4c1f-9183-000957ab8547\") " pod="openshift-must-gather-kkzt9/perf-node-gather-daemonset-6mck7"
Apr 17 18:12:47.273906 ip-10-0-136-202 kubenswrapper[2570]: I0417 18:12:47.273848 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8l5l8\" (UniqueName: \"kubernetes.io/projected/2283a251-6e74-4c1f-9183-000957ab8547-kube-api-access-8l5l8\") pod \"perf-node-gather-daemonset-6mck7\" (UID: \"2283a251-6e74-4c1f-9183-000957ab8547\") " pod="openshift-must-gather-kkzt9/perf-node-gather-daemonset-6mck7"
Apr 17 18:12:47.375095 ip-10-0-136-202 kubenswrapper[2570]: I0417 18:12:47.375067 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/2283a251-6e74-4c1f-9183-000957ab8547-podres\") pod \"perf-node-gather-daemonset-6mck7\" (UID: \"2283a251-6e74-4c1f-9183-000957ab8547\") " pod="openshift-must-gather-kkzt9/perf-node-gather-daemonset-6mck7"
Apr 17 18:12:47.375269 ip-10-0-136-202 kubenswrapper[2570]: I0417 18:12:47.375106 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8l5l8\" (UniqueName: \"kubernetes.io/projected/2283a251-6e74-4c1f-9183-000957ab8547-kube-api-access-8l5l8\") pod \"perf-node-gather-daemonset-6mck7\" (UID: \"2283a251-6e74-4c1f-9183-000957ab8547\") " pod="openshift-must-gather-kkzt9/perf-node-gather-daemonset-6mck7"
Apr 17 18:12:47.375269 ip-10-0-136-202 kubenswrapper[2570]: I0417 18:12:47.375135 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/2283a251-6e74-4c1f-9183-000957ab8547-proc\") pod \"perf-node-gather-daemonset-6mck7\" (UID: \"2283a251-6e74-4c1f-9183-000957ab8547\") " pod="openshift-must-gather-kkzt9/perf-node-gather-daemonset-6mck7"
Apr 17 18:12:47.375269 ip-10-0-136-202 kubenswrapper[2570]: I0417 18:12:47.375180 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/2283a251-6e74-4c1f-9183-000957ab8547-sys\") pod \"perf-node-gather-daemonset-6mck7\" (UID: \"2283a251-6e74-4c1f-9183-000957ab8547\") " pod="openshift-must-gather-kkzt9/perf-node-gather-daemonset-6mck7"
Apr 17 18:12:47.375269 ip-10-0-136-202 kubenswrapper[2570]: I0417 18:12:47.375207 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/2283a251-6e74-4c1f-9183-000957ab8547-lib-modules\") pod \"perf-node-gather-daemonset-6mck7\" (UID: \"2283a251-6e74-4c1f-9183-000957ab8547\") " pod="openshift-must-gather-kkzt9/perf-node-gather-daemonset-6mck7"
Apr 17 18:12:47.375269 ip-10-0-136-202 kubenswrapper[2570]: I0417 18:12:47.375240 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/2283a251-6e74-4c1f-9183-000957ab8547-podres\") pod \"perf-node-gather-daemonset-6mck7\" (UID: \"2283a251-6e74-4c1f-9183-000957ab8547\") " pod="openshift-must-gather-kkzt9/perf-node-gather-daemonset-6mck7"
Apr 17 18:12:47.375269 ip-10-0-136-202 kubenswrapper[2570]: I0417 18:12:47.375238 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/2283a251-6e74-4c1f-9183-000957ab8547-proc\") pod \"perf-node-gather-daemonset-6mck7\" (UID: \"2283a251-6e74-4c1f-9183-000957ab8547\") " pod="openshift-must-gather-kkzt9/perf-node-gather-daemonset-6mck7"
Apr 17 18:12:47.375488 ip-10-0-136-202 kubenswrapper[2570]: I0417 18:12:47.375281 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/2283a251-6e74-4c1f-9183-000957ab8547-sys\") pod \"perf-node-gather-daemonset-6mck7\" (UID: \"2283a251-6e74-4c1f-9183-000957ab8547\") " pod="openshift-must-gather-kkzt9/perf-node-gather-daemonset-6mck7"
Apr 17 18:12:47.375488 ip-10-0-136-202 kubenswrapper[2570]: I0417 18:12:47.375358 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/2283a251-6e74-4c1f-9183-000957ab8547-lib-modules\") pod \"perf-node-gather-daemonset-6mck7\" (UID: \"2283a251-6e74-4c1f-9183-000957ab8547\") " pod="openshift-must-gather-kkzt9/perf-node-gather-daemonset-6mck7"
Apr 17 18:12:47.387107 ip-10-0-136-202 kubenswrapper[2570]: I0417 18:12:47.387075 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-8l5l8\" (UniqueName: \"kubernetes.io/projected/2283a251-6e74-4c1f-9183-000957ab8547-kube-api-access-8l5l8\") pod \"perf-node-gather-daemonset-6mck7\" (UID: \"2283a251-6e74-4c1f-9183-000957ab8547\") " pod="openshift-must-gather-kkzt9/perf-node-gather-daemonset-6mck7"
Apr 17 18:12:47.494795 ip-10-0-136-202 kubenswrapper[2570]: I0417 18:12:47.494721 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-kkzt9/perf-node-gather-daemonset-6mck7"
Apr 17 18:12:47.617773 ip-10-0-136-202 kubenswrapper[2570]: I0417 18:12:47.617742 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-kkzt9/perf-node-gather-daemonset-6mck7"]
Apr 17 18:12:47.620608 ip-10-0-136-202 kubenswrapper[2570]: W0417 18:12:47.620578 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod2283a251_6e74_4c1f_9183_000957ab8547.slice/crio-102bb771b2f2e61260c90d1e80b7500498692a1a7b15509fa1f447e32cc26ea7 WatchSource:0}: Error finding container 102bb771b2f2e61260c90d1e80b7500498692a1a7b15509fa1f447e32cc26ea7: Status 404 returned error can't find the container with id 102bb771b2f2e61260c90d1e80b7500498692a1a7b15509fa1f447e32cc26ea7
Apr 17 18:12:47.654099 ip-10-0-136-202 kubenswrapper[2570]: I0417 18:12:47.654078 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-qbjvx_5d8aa389-0914-474c-8a7a-11a05cb1ee33/dns/0.log"
Apr 17 18:12:47.677431 ip-10-0-136-202 kubenswrapper[2570]: I0417 18:12:47.677403 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-qbjvx_5d8aa389-0914-474c-8a7a-11a05cb1ee33/kube-rbac-proxy/0.log"
Apr 17 18:12:47.766460 ip-10-0-136-202 kubenswrapper[2570]: I0417 18:12:47.766441 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-b6njn_e75ff2ec-f311-467f-a5b6-86322293f3ed/dns-node-resolver/0.log"
Apr 17 18:12:48.195676 ip-10-0-136-202 kubenswrapper[2570]: I0417 18:12:48.195598 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-kkzt9/perf-node-gather-daemonset-6mck7" event={"ID":"2283a251-6e74-4c1f-9183-000957ab8547","Type":"ContainerStarted","Data":"84a80b3dec048bc4ad78499f143bfeb812b976a434f061e00c71ba63e87f2cae"}
Apr 17 18:12:48.195676 ip-10-0-136-202 kubenswrapper[2570]: I0417 18:12:48.195635 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-kkzt9/perf-node-gather-daemonset-6mck7" event={"ID":"2283a251-6e74-4c1f-9183-000957ab8547","Type":"ContainerStarted","Data":"102bb771b2f2e61260c90d1e80b7500498692a1a7b15509fa1f447e32cc26ea7"}
Apr 17 18:12:48.196028 ip-10-0-136-202 kubenswrapper[2570]: I0417 18:12:48.195700 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-must-gather-kkzt9/perf-node-gather-daemonset-6mck7"
Apr 17 18:12:48.212668 ip-10-0-136-202 kubenswrapper[2570]: I0417 18:12:48.212613 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-kkzt9/perf-node-gather-daemonset-6mck7" podStartSLOduration=1.212595425 podStartE2EDuration="1.212595425s" podCreationTimestamp="2026-04-17 18:12:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 18:12:48.211021637 +0000 UTC m=+2928.590365461" watchObservedRunningTime="2026-04-17 18:12:48.212595425 +0000 UTC m=+2928.591939240"
Apr 17 18:12:48.213705 ip-10-0-136-202 kubenswrapper[2570]: I0417 18:12:48.213679 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_image-registry-76d6f5448c-dqvzx_6dc52916-462a-4208-9837-dd6fbb842e70/registry/0.log"
Apr 17 18:12:48.233924 ip-10-0-136-202 kubenswrapper[2570]: I0417 18:12:48.233891 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_image-registry-76d6f5448c-dqvzx_6dc52916-462a-4208-9837-dd6fbb842e70/registry/1.log"
Apr 17 18:12:48.302441 ip-10-0-136-202 kubenswrapper[2570]: I0417 18:12:48.302418 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-ks8m2_8947a5af-e839-4e78-8aef-37f0885ea400/node-ca/0.log"
Apr 17 18:12:49.320281 ip-10-0-136-202 kubenswrapper[2570]: I0417 18:12:49.320246 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-szcn9_e067e25c-97d9-490a-b6a9-e17550577b5e/serve-healthcheck-canary/0.log"
Apr 17 18:12:49.685655 ip-10-0-136-202 kubenswrapper[2570]: I0417 18:12:49.685569 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-88mpn_7f76d408-5470-4147-8801-3af404c3630b/kube-rbac-proxy/0.log"
Apr 17 18:12:49.703267 ip-10-0-136-202 kubenswrapper[2570]: I0417 18:12:49.703248 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-88mpn_7f76d408-5470-4147-8801-3af404c3630b/exporter/0.log"
Apr 17 18:12:49.726436 ip-10-0-136-202 kubenswrapper[2570]: I0417 18:12:49.726413 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-88mpn_7f76d408-5470-4147-8801-3af404c3630b/extractor/0.log"
Apr 17 18:12:54.208136 ip-10-0-136-202 kubenswrapper[2570]: I0417 18:12:54.208102 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-must-gather-kkzt9/perf-node-gather-daemonset-6mck7"
Apr 17 18:12:57.314349 ip-10-0-136-202 kubenswrapper[2570]: I0417 18:12:57.314321 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-j4mx4_02eb784a-744c-4eac-bec8-46a3df516ca9/kube-multus-additional-cni-plugins/0.log"
Apr 17 18:12:57.332356 ip-10-0-136-202 kubenswrapper[2570]: I0417 18:12:57.332334 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-j4mx4_02eb784a-744c-4eac-bec8-46a3df516ca9/egress-router-binary-copy/0.log"
Apr 17 18:12:57.349521 ip-10-0-136-202 kubenswrapper[2570]: I0417 18:12:57.349503 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-j4mx4_02eb784a-744c-4eac-bec8-46a3df516ca9/cni-plugins/0.log"
Apr 17 18:12:57.367660 ip-10-0-136-202 kubenswrapper[2570]: I0417 18:12:57.367641 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-j4mx4_02eb784a-744c-4eac-bec8-46a3df516ca9/bond-cni-plugin/0.log"
Apr 17 18:12:57.385725 ip-10-0-136-202 kubenswrapper[2570]: I0417 18:12:57.385705 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-j4mx4_02eb784a-744c-4eac-bec8-46a3df516ca9/routeoverride-cni/0.log"
Apr 17 18:12:57.404879 ip-10-0-136-202 kubenswrapper[2570]: I0417 18:12:57.404860 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-j4mx4_02eb784a-744c-4eac-bec8-46a3df516ca9/whereabouts-cni-bincopy/0.log"
Apr 17 18:12:57.422784 ip-10-0-136-202 kubenswrapper[2570]: I0417 18:12:57.422684 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-j4mx4_02eb784a-744c-4eac-bec8-46a3df516ca9/whereabouts-cni/0.log"
Apr 17 18:12:57.475387 ip-10-0-136-202 kubenswrapper[2570]: I0417 18:12:57.475369 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-xtx74_9f3a808b-95e3-410b-bcf1-257bf1254f04/kube-multus/0.log"
Apr 17 18:12:57.574457 ip-10-0-136-202 kubenswrapper[2570]: I0417 18:12:57.574394 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-zcn2s_82dffe03-2b0c-4ac4-bb02-5a8430704805/network-metrics-daemon/0.log"
Apr 17 18:12:57.591415 ip-10-0-136-202 kubenswrapper[2570]: I0417 18:12:57.591380 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-zcn2s_82dffe03-2b0c-4ac4-bb02-5a8430704805/kube-rbac-proxy/0.log"
Apr 17 18:12:58.365845 ip-10-0-136-202 kubenswrapper[2570]: I0417 18:12:58.365768 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-42tv5_0a83ec74-42da-427f-be72-02e777b9626c/ovn-controller/0.log"
Apr 17 18:12:58.405719 ip-10-0-136-202 kubenswrapper[2570]: I0417 18:12:58.405698 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-42tv5_0a83ec74-42da-427f-be72-02e777b9626c/ovn-acl-logging/0.log"
Apr 17 18:12:58.428777 ip-10-0-136-202 kubenswrapper[2570]: I0417 18:12:58.428756 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-42tv5_0a83ec74-42da-427f-be72-02e777b9626c/kube-rbac-proxy-node/0.log"
Apr 17 18:12:58.451215 ip-10-0-136-202 kubenswrapper[2570]: I0417 18:12:58.451165 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-42tv5_0a83ec74-42da-427f-be72-02e777b9626c/kube-rbac-proxy-ovn-metrics/0.log"
Apr 17 18:12:58.467998 ip-10-0-136-202 kubenswrapper[2570]: I0417 18:12:58.467982 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-42tv5_0a83ec74-42da-427f-be72-02e777b9626c/northd/0.log"
Apr 17 18:12:58.487255 ip-10-0-136-202 kubenswrapper[2570]: I0417 18:12:58.487223 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-42tv5_0a83ec74-42da-427f-be72-02e777b9626c/nbdb/0.log"
Apr 17 18:12:58.508982 ip-10-0-136-202 kubenswrapper[2570]: I0417 18:12:58.508960 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-42tv5_0a83ec74-42da-427f-be72-02e777b9626c/sbdb/0.log"
Apr 17 18:12:58.664456 ip-10-0-136-202 kubenswrapper[2570]: I0417 18:12:58.664398 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-42tv5_0a83ec74-42da-427f-be72-02e777b9626c/ovnkube-controller/0.log"
Apr 17 18:13:00.351942 ip-10-0-136-202 kubenswrapper[2570]: I0417 18:13:00.351921 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-target-xbzwj_3b7788b5-7a25-42a0-8536-8a543283bb1e/network-check-target-container/0.log"
Apr 17 18:13:01.194889 ip-10-0-136-202 kubenswrapper[2570]: I0417 18:13:01.194865 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-operator_iptables-alerter-gbft6_7683382f-975a-43b6-9d75-7d14283c2328/iptables-alerter/0.log"
Apr 17 18:13:01.804431 ip-10-0-136-202 kubenswrapper[2570]: I0417 18:13:01.804399 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-node-tuning-operator_tuned-426x5_1f813146-daee-4a79-a436-af839213c97f/tuned/0.log"